Patent Abstract:
Method, system, and computer-readable storage memory. The present invention relates to cross-slide gestures for touch displays. In at least some embodiments, cross-slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection (110), drag-and-drop operations, and the like. In one or more embodiments, a cross-slide gesture can be performed by dragging an item or object (310, 312, 314, 316, 318, 320, 322, 324, 326, 328) in a direction that is different from a scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality.
Publication number: BR112014002379B1
Application number: R112014002379-4
Filing date: 2012-07-17
Publication date: 2021-07-27
Inventors: Jan-Kristian Markiewicz;Gerrit H. Hofmeester;Orry W. Soegiono;Jon Gabriel Clapper;Jennifer Marie Wolfe;Chantal M. Leonard;Theresa B. Pittappilly;Holger Kuehnle;John C. Whytock
Applicant: Microsoft Technology Licensing, LLC
IPC primary class:
Patent Description:

Background
[0001] The present invention relates to one of the challenges that continues to face designers of devices having user-engageable displays, such as touch displays: providing enhanced functionality for users through gestures that can be employed with the devices. This is so not only with devices having larger or multiple screens, but also in the context of devices having a smaller footprint, such as tablet PCs, handheld devices, smaller multi-screen devices, and the like.
[0002] One challenge with gesture-based input is that of providing secondary actions. For example, in touch interfaces today, it is common to tap on an item to launch that item. This makes it difficult to provide secondary functionality, such as the ability to select items. Further, certain challenges exist with so-called pannable surfaces, i.e., surfaces that can be panned and have their content moved. For example, a pannable surface typically reacts to a finger drag and moves the content in the direction of the user's finger. If the surface contains objects that a user might want to rearrange, it is difficult to differentiate whether the user wants to pan the surface or rearrange the content.
Summary
[0003] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
[0004] Cross-slide gestures for touch displays are described. In at least some embodiments, cross-slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag-and-drop operations, and the like.
[0005] In one or more embodiments, a cross-slide gesture can be performed by dragging an item or object in a direction that is different from a panning or scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to the additional actions or functionality.
[0006] In at least some embodiments, so-called speed bumps, or other perceptible indicia such as visual indicia, can be used to provide a user with an understanding or awareness of the thresholds.
Brief Description of Drawings
[0007] The detailed description is described with reference to the accompanying figures. In the figures, the leftmost digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
[0008] Figure 1 is an illustration of an environment in an exemplary implementation in accordance with one or more embodiments.
[0009] Figure 2 is an illustration of a system in an exemplary implementation that shows Figure 1 in greater detail.
[00010] Figure 3 illustrates an exemplary computing device in accordance with one or more embodiments.
[00011] Figure 4 illustrates an exemplary computing device in accordance with one or more embodiments.
[00012] Figure 5 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00013] Figure 6 illustrates an exemplary computing device in accordance with one or more embodiments.
[00014] Figure 7 illustrates an exemplary computing device in accordance with one or more embodiments.
[00015] Figure 8 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00016] Figure 9 illustrates an example of cross-slide detection in accordance with one or more embodiments.
[00017] Figure 10 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00018] Figure 11 illustrates distance thresholds in accordance with one or more embodiments.
[00019] Figure 12 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00020] Figure 13 illustrates distance thresholds in accordance with one or more embodiments.
[00021] Figure 14 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00022] Figure 15 illustrates a cross-slide gesture in accordance with one or more embodiments.
[00023] Figure 16 is a flowchart that describes steps in a method in accordance with one or more embodiments.
[00024] Figure 17 illustrates an exemplary computing device that can be used to implement the various embodiments described herein.
Detailed Description
Overview
[00025] Cross-slide gestures for touch displays are described. In at least some embodiments, cross-slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag-and-drop operations, and the like.
[00026] In one or more embodiments, a cross-slide gesture can be performed by dragging an item or object in a direction that is different from, for example orthogonal to, a panning or scrolling direction. Dragging can be performed via a touch-related drag, such as by a finger, stylus, pen, and the like, via a mouse/trackpad drag, and the like. In the examples described in this document, a touch-related drag is used. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to the additional actions or functionality. For example, in the context of a horizontally scrollable list, dragging an object vertically a short distance and releasing it may mark the object as selected, while dragging the object a greater distance vertically may break the object free of an associated list so that it can be dropped elsewhere.
[00027] In at least some embodiments, so-called speed bumps, or other perceptible indicia such as visual indicia, can be used to provide a user with an understanding or awareness of the thresholds.
[00028] Various embodiments described herein enable an item to be dragged without necessarily entering a mode. A mode can be thought of as an action that is initiated by the user and that is not necessarily related to manipulating an item directly. For example, a mode can be entered by clicking on a particular user interface button and then being exposed to functionality that can be performed relative to an item or object. In the described embodiments, modes can be avoided by eliminating, in at least some cases, user interface elements for accessing drag functionality.
[00029] In the discussion that follows, an exemplary environment is first described that is operable to employ the gesture techniques described herein. Exemplary illustrations of the gestures and procedures are then described, which can be employed in the exemplary environment as well as in other environments. Accordingly, the exemplary environment is not limited to performing the exemplary gestures, and the gestures are not limited to implementation in the exemplary environment.
Exemplary Environment
[00030] Figure 1 is an illustration of an environment 100 in an exemplary implementation that is operable to employ cross-slide gestures as described herein. The illustrated environment 100 includes an example of a computing device 102 that can be configured in a variety of ways. For example, the computing device 102 may be configured as a traditional computer (e.g., a desktop personal computer, a laptop computer, and so on), a mobile station, an entertainment appliance, a set-top box coupled to a television, a wireless phone, a netbook, a game console, a handheld device, and so forth, as further described in relation to Figure 2. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, handheld game consoles). The computing device 102 also includes software that causes the computing device 102 to perform one or more operations as described below.
[00031] The computing device 102 includes a gesture module 104 that is operative to provide gesture functionality as described in this document. The gesture module can be implemented in connection with any suitable type of hardware, software, firmware, or combination thereof. In at least some embodiments, the gesture module is implemented in software that resides on some type of tangible, computer-readable storage medium, examples of which are provided below.
[00032] The gesture module 104 is representative of functionality that recognizes gestures, including cross-slide gestures that can be performed by one or more fingers, and causes operations to be performed that correspond to the gestures. The gestures can be recognized by the module 104 in a variety of different ways. For example, the gesture module 104 may be configured to recognize a touch input, such as a finger of a user's hand 106a as proximal to the display device 108 of the computing device 102 using touch screen functionality. In particular, the gesture module 104 can recognize cross-slide gestures that can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag-and-drop operations, and the like.
[00033] For example, in the illustrated example, a panning or scrolling direction is shown as being the vertical direction, as indicated by the arrows. In one or more embodiments, a cross-slide gesture can be performed by dragging an item or object in a direction that is different from, for example orthogonal to, the panning or scrolling direction. The different-direction drag can be mapped to additional actions or functionality. With respect to whether a direction is vertical or horizontal, a vertical direction can be thought of, in at least some instances, as a direction that is generally parallel to one side of the display device, and a horizontal direction can be thought of as a direction that is generally orthogonal to the vertical direction. Thus, although the orientation of a computing device can change, the verticality or horizontality of a particular cross-slide gesture can remain standard, as being defined relative to and along the display device.
[00034] For example, a finger of the user's hand 106a is illustrated as selecting 110 an image 112 displayed by the display device 108. Selection 110 of the image 112 and subsequent movement of the finger of the user's hand 106a in a direction that is different from the panning or scrolling direction, e.g., generally orthogonal to the panning or scrolling direction, may be recognized by the gesture module 104. The gesture module 104 may then identify this recognized movement, by the nature and character of the movement, as indicating a "drag and drop" operation to change the location of the image 112 to a point on the display at which the finger of the user's hand 106a is lifted away from the display device 108. Thus, recognition of the touch input that describes the selection of the image, movement of the selection point to another location, and then lifting of the finger of the user's hand 106a may be used to identify a gesture (e.g., a drag-and-drop gesture) that is to initiate the drag-and-drop operation.
[00035] Although cross-slide gestures are primarily discussed in this document, it is to be appreciated and understood that a variety of different types of gestures may be recognized by the gesture module 104, including, by way of example and not limitation, gestures that are recognized from a single type of input (for example, touch gestures such as the previously described drag-and-drop gesture) as well as gestures involving multiple types of inputs. For example, the module 104 can be utilized to recognize single-finger gestures and bezel gestures, multiple-finger/same-hand gestures and bezel gestures, and/or multiple-finger/different-hand gestures and bezel gestures.
[00036] For example, the computing device 102 may be configured to detect and differentiate between a touch input (for example, provided by one or more fingers of the user's hand 106a) and a stylus input (for example, provided by a stylus 116). The differentiation may be performed in a variety of ways, such as by detecting the amount of the display device 108 that is contacted by the finger of the user's hand 106a versus the amount of the display device 108 that is contacted by the stylus 116.
[00037] Thus, the gesture module 104 may support a variety of different gesture techniques through recognition and leverage of a division between stylus and touch inputs, as well as different types of touch inputs.
[00038] Figure 2 illustrates an exemplary system that shows the gesture module 104 as being implemented in an environment where multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device is a "cloud" server farm, which comprises one or more server computers that are connected to the multiple devices through a network, the Internet, or other means.
[00039] In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to the user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable delivery of an experience to the device that is both tailored to the device and yet common to all of the devices. In one embodiment, a "class" of target device is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features or usage or other common characteristics of the devices. For example, as previously described, the computing device 102 may be configured in a variety of different ways, such as for mobile 202, computer 204, and television 206 uses. Each of these configurations has a generally corresponding screen size, and thus the computing device 102 may be configured as one of these device classes in this exemplary system 200. For instance, the computing device 102 may assume the mobile 202 class of device, which includes mobile phones, music players, game devices, and so on. The computing device 102 may also assume the computer 204 class of device, which includes personal computers, laptop computers, netbooks, and so on. The television 206 configuration includes configurations of devices that involve display in a casual environment, e.g., televisions, set-top boxes, game consoles, and so on. Thus, the techniques described herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples described in the following sections.
[00040] The cloud 208 is illustrated as including a platform 210 for web services 212. The platform 210 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 208 and thus may act as a "cloud operating system". For example, the platform 210 may abstract resources to connect the computing device 102 with other computing devices. The platform 210 may also serve to abstract scaling of resources to provide a level of scale corresponding to the encountered demand for the web services 212 that are implemented via the platform 210. A variety of other examples are also contemplated, such as load balancing of servers in a server farm, protection against malicious parties (e.g., spam, viruses, and other malware), and so on.
[00041] Thus, the cloud 208 is included as a part of the strategy that pertains to the software and hardware resources that are made available to the computing device 102 via the Internet or other networks. For example, the gesture module 104 may be implemented in part on the computing device 102 as well as via a platform 210 that supports the web services 212.
[00042] For example, the gesture techniques supported by the gesture module may be detected using touch screen functionality in the mobile configuration 202, trackpad functionality of the computer 204 configuration, detected by a camera as part of support of a natural user interface (NUI) that does not involve contact with a specific input device, and so on. Further, performance of the operations to detect and recognize the inputs to identify a particular gesture may be distributed throughout the system 200, such as by the computing device 102 and/or the web services 212 supported by the platform 210 of the cloud 208.
[00043] Generally, any of the functions described herein may be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms "module", "functionality", and "logic" as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on or by a processor (e.g., CPU or CPUs). The program code may be stored on one or more computer-readable memory devices. The features of the gesture techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
[00044] In the discussion that follows, various sections describe exemplary cross-slide gestures, including rearrangement gestures. A section entitled "Cross-Slide Gesture/Method Relative to Panning Direction" describes a cross-slide gesture that can be performed relative to a panning direction in accordance with one or more embodiments. Next, a section entitled "Method/Gesture for Rearranging Items in a Pannable List" describes how items can be arranged and rearranged using a cross-slide gesture in accordance with one or more embodiments. Following this, a section entitled "Detecting Cross-Slide Gestures" describes how cross-slide gestures can be detected in accordance with one or more embodiments. Next, a section entitled "Combining Multiple Interactions" describes how multiple interactions can be combined in conjunction with cross-slide gestures in accordance with one or more embodiments. Following this, a section entitled "Direct Manipulation to Facilitate Threshold Discernability" describes how direct-manipulation feedback can be provided to enable a user to become aware of various thresholds in accordance with one or more embodiments. Next, a section entitled "Interaction Feedback" describes embodiments in which feedback can be provided to a user in accordance with one or more embodiments. Finally, a section entitled "Exemplary Device" describes aspects of an exemplary device that can be used to implement one or more embodiments.
Cross-Slide Gesture/Method Relative to Panning Direction
[00045] In one or more embodiments, a cross-slide gesture can be performed, to cause an object-related action to be performed, by dragging an item or object in a direction that is different from, for example orthogonal to, a scrolling or panning direction.
[00046] As an example, consider Figure 3, which illustrates an environment 300 in accordance with one or more embodiments. Here, the computing device 302 includes a display device 308 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headed arrow 304 and as suggested by the scroll bar 305. The display device 308 has displayed, thereon, multiple different objects or items 310, 312, 314, 316, 318, 320, 322, 324, which are shown in their entirety, and partial objects or items 326, 328. In this example, a user can scroll or pan in the horizontal direction using a swipe gesture over the display device 308 in the horizontal direction. Alternatively, a user can cause an object-related action to be performed by performing a cross-slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
[00047] As an example, consider the bottommost illustration of the computing device 302. There, a user's hand 306a has touched down over item 312 and moved it in a direction that is different from the scrolling or panning direction. In this particular example, the different direction is generally orthogonal to the scrolling or panning direction, in a downward direction. It is to be appreciated and understood that, in at least some embodiments, the object can be moved upward and downward or, more generally, bi-directionally, to access the same or different object-related actions. Any suitable type of object-related action can be performed. For example, one type of object-related action can include, by way of example and not limitation, object selection. Note, in this example, that the selected item is directly manipulated and visual feedback is provided, as the user is able to observe the object move responsive to the user's engagement. Note also that, in this embodiment and the ones described below, the object-related action is performed without showing additional user interface elements, such as a button to enable a command selection. Other object-related actions can be performed, such as object deletion and other object manipulation actions.
[00048] As another example, consider Figure 4, which illustrates an environment 400 in accordance with one or more embodiments. Here, the computing device 402 includes a display device 408 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headed arrow 404 and as suggested by the scroll bar 405. The display device 408 has displayed, thereon, multiple different objects or items 410, 412, 414, 416, 418, 420, 422, 424, which are shown in their entirety. In this example, a user can scroll or pan in the vertical direction using a swipe gesture over the display device 408 in the vertical direction. Alternatively, a user can cause an object-related action to be performed by performing a cross-slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
[00049] As an example, consider the bottommost illustration of the computing device 402. There, a user's hand 406a has touched down over item 412 and moved it in a direction that is different from the scrolling or panning direction. In this particular example, the different direction is generally orthogonal to the scrolling or panning direction. Any suitable type of object-related action can be performed, examples of which are provided below. For example, one type of object-related action can include, by way of example and not limitation, object selection. It is to be appreciated and understood that functionality that is accessible through cross-slide gestures can be accessed in connection with moving the object or item any suitable threshold distance to invoke the object-related action. In at least some embodiments, there may be no threshold distance to invoke an object-related action. In these instances, movement in a direction different from the scrolling or panning direction can be utilized to invoke the object-related action.
[00050] Figure 5 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00051] Step 500 detects a gesture slide input relative to a display device associated with a computing device. Step 502 ascertains whether the direction of the gesture slide input is different from a panning direction. If the direction is not different from the panning direction, step 504 pans the content in the direction of the gesture slide input. If, on the other hand, the direction of the gesture slide input is different from the panning direction, step 506 performs an object-related action. Any suitable type of object-related action can be performed, examples of which are provided below.
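The following TypeScript fragment is a minimal sketch of the decision in steps 502-506, assuming a known panning axis; it is illustrative only, and the names classifyDrag and PanAxis are assumptions rather than anything recited in the patent.

```typescript
// Hedged sketch of steps 500-506: compare the dominant axis of the
// drag vector against the panning axis of the content.
type PanAxis = "horizontal" | "vertical";
type DragOutcome = "pan" | "object-action";

function classifyDrag(dx: number, dy: number, panAxis: PanAxis): DragOutcome {
  const dragAxis: PanAxis = Math.abs(dx) >= Math.abs(dy) ? "horizontal" : "vertical";
  return dragAxis === panAxis ? "pan" : "object-action";
}

// Content pans horizontally; a mostly vertical drag maps to an
// object-related action such as selection (steps 502/506).
console.log(classifyDrag(40, 5, "horizontal")); // "pan"
console.log(classifyDrag(5, 40, "horizontal")); // "object-action"
```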
Method/Gesture for Rearranging Items in a Pannable List
[00052] In one or more embodiments, a cross-slide gesture can be performed, effective to cause an object-related action in the form of an object-rearrangement action to be performed, by dragging an item or object in a direction that is different from, for example orthogonal to or generally not in the direction associated with, a scrolling or panning direction.
[00053] As an example, consider Figure 6, which illustrates an environment 600 in accordance with one or more embodiments. Here, the computing device 602 includes a display device 608 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headed arrow 604 and as suggested by the scroll bar 605. The display device 608 has displayed, thereon, multiple different objects or items 610, 612, 614, 616, 618, 620, 622, 624, which are shown in their entirety, and partial objects or items 626, 628. In this example, a user can scroll or pan in the horizontal direction using a swipe gesture over the display device 608 in the horizontal direction. Alternatively, a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross-slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction. For example, in the topmost illustration, a user's hand 606a has touched the display device 608 over object 612 and dragged the object in a first direction that is generally orthogonal to the scrolling or panning direction, and then in a second direction, toward the bottom left corner of the display device 608. Here, the first direction is a generally vertical direction. Dragging the object in the first direction indicates to the gesture module that the object is to be rearranged.
[00054] Now consider the bottommost illustration of the computing device 602. Here, the user's hand 606a has dragged object 612 to its illustrated position and dropped it in place. Subsequently, the user's hand touched the display device 608 over object 618 and dragged the object in a first direction that is generally orthogonal to the scrolling or panning direction, and then in a second direction, toward the middle portion of the display device. Here, the first direction is a generally vertical direction. Once the user's hand is lifted from the touched display device 608, object 618 will be dropped in its illustrated location.
[00055] As another example, consider Figure 7, which illustrates an environment 700 in accordance with one or more embodiments. Here, the computing device 702 includes a display device 708 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headed arrow 704 and as suggested by the scroll bar 705. The display device 708 has displayed, thereon, multiple different objects or items 710, 712, 714, 716, 718, 720, 722, 724. In this example, a user can scroll or pan in the vertical direction using a swipe gesture over the display device 708 in the vertical direction. Alternatively, a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross-slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction. For example, in the topmost illustration, a user's hand 706a has touched the display device 708 over object 712 and dragged the object in a direction that is generally orthogonal to the scrolling or panning direction. Here, the direction is a generally horizontal direction. Dragging an object in this direction indicates to the gesture module that the object is to be rearranged.
[00056] Now consider the bottommost illustration of the computing device 702. Here, the user's hand 706a has dragged object 712 to its illustrated position and dropped it in place. Subsequently, the user's hand touched the display device 708 over object 710 and dragged the object in a direction that is generally orthogonal to the scrolling or panning direction. Here, the direction is a generally horizontal direction. Once the user's hand is lifted from the touched display device 708, object 710 will be dropped in its illustrated location.
[00057] Figure 8 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00058] Step 800 detects a drag direction associated with a drag operation relative to a display device associated with a computing device. Step 802 ascertains whether the drag direction is different from a panning direction. If the drag direction is not different from the panning direction, step 804 pans the content in the drag direction. If, on the other hand, the drag direction is different from the panning direction, step 806 performs an object-rearrangement action. Examples of how this can be done are provided above. In one or more embodiments, rearrangement can occur in any suitable direction.
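As one hedged illustration of step 806, the new position of a dropped item can be derived from the drop point. The sketch below assumes a horizontal list of equal-width items; the itemWidth parameter and the index arithmetic are assumptions for the example, not details recited in the patent.

```typescript
// Move an item to the list slot under the horizontal drop coordinate.
function rearrange<T>(items: T[], fromIndex: number, dropX: number, itemWidth: number): T[] {
  const toIndex = Math.max(0, Math.min(items.length - 1, Math.floor(dropX / itemWidth)));
  const result = items.slice();
  const [moved] = result.splice(fromIndex, 1); // lift the dragged item out
  result.splice(toIndex, 0, moved);            // drop it at the target slot
  return result;
}

console.log(rearrange(["a", "b", "c", "d"], 1, 260, 100)); // ["a", "c", "b", "d"]
```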
Detecting Cross-Slide Gestures
[00059] Cross-slide gestures can be detected in any suitable way. As but one example of how cross-slide gestures can be detected, consider the following in connection with Figure 9. In one or more embodiments, to detect whether a user is panning or cross-sliding, region detection logic can be employed, as graphically illustrated in Figure 9.
[00060] In this example, assume that the user has displayed a horizontally pannable list of items. When the user puts their finger down on an object, as within the illustrated circle 900, and starts to drag their finger outside of the circle's boundary, region detection can be employed to ascertain the result of the drag. For example, in a situation where a drag occurs into one of regions 902, the content will be panned in a corresponding direction. However, a drag into one of regions 904 will be recognized as a cross-slide gesture and, accordingly, functionality associated with the cross-slide gesture can be implemented.
[00061] In the illustrated example, regions 902 and 904 are generally similarly sized. However, based on the scenario, certain actions can be prioritized by changing the angle or range of entry angles, for example angles a and b, of the different regions. For example, by making angles a larger, thereby increasing their range (and making angles b smaller, thereby decreasing their range), it is easier to start panning without accidentally performing a cross-slide gesture, and vice versa.
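One way to realize this region test is sketched below in TypeScript. It assumes a horizontally pannable list, and the 45-degree default split between regions 902 and 904 is an invented value for illustration, not one taken from the patent.

```typescript
// Once the finger leaves the initial circle, the drag angle relative to
// the pan axis decides between panning and cross-slide.
function detectRegion(
  dx: number,
  dy: number,
  radius: number,
  panAngleDeg = 45,
): "undecided" | "pan" | "cross-slide" {
  if (Math.hypot(dx, dy) < radius) return "undecided"; // still inside circle 900
  // Angle of the drag measured from the horizontal pan axis, folded to 0..90.
  const angle = Math.abs(Math.atan2(dy, dx)) * (180 / Math.PI);
  const fromPanAxis = Math.min(angle, 180 - angle);
  // Widening panAngleDeg (angles a) favors panning; narrowing it favors
  // the cross-slide regions (angles b), as paragraph [00061] describes.
  return fromPanAxis <= panAngleDeg ? "pan" : "cross-slide";
}

console.log(detectRegion(3, 2, 10));  // "undecided" (inside the circle)
console.log(detectRegion(30, 5, 10)); // "pan" (close to the pan axis)
console.log(detectRegion(5, 30, 10)); // "cross-slide"
```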
[00062] Figure 10 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00063] Step 1000 defines one or more regions associated with a panning gesture. Any suitable region geometry can be utilized, an example of which is provided above. Step 1002 defines one or more regions associated with a cross-slide gesture. Again, any suitable region geometry can be utilized, an example of which is provided above. In the Figure 9 example, the region geometries are generally triangular in shape and converge at a point associated with a touch input. Other geometries can be utilized without departing from the spirit and scope of the claimed subject matter.
[00064] Step 1004 detects a drag operation. This step can be performed by detecting gestural input in the form of a touch gesture, such as a swipe. Step 1006 ascertains the region within which the drag operation occurs. If, at step 1008, the region is associated with a panning gesture, step 1010 pans the content in an associated direction. If, on the other hand, the region is not associated with a panning gesture, step 1012 performs an operation associated with a cross-slide gesture. Any suitable object-related action can be performed, including, by way of example and not limitation, object selection, object deletion, object rearrangement, and the like.
[00065] Having considered how drag operations can be detected and differentiated in accordance with one or more embodiments, consider now a discussion of how multiple interactions can be combined.
Combining Multiple Interactions
[00066] In some instances, it may be desirable to have a threshold that can be used to commit an object-related action, such as a drag threshold that enables a drag direction to be locked. Any suitable type of threshold can be utilized, including, by way of example and not limitation, distance thresholds, velocity thresholds, directionality thresholds, any combination of the aforementioned thresholds, as well as other thresholds. For example, a combination of distance and velocity thresholds can be used to mitigate what might otherwise constitute an accidental or unintended action. For example, when a particular threshold is reached, the velocity of the finger's movement could be ascertained. If the velocity is below a particular threshold, then a drag action could be invoked. If it is above the particular threshold, then perhaps an object-selection action would be performed.
[00067] This makes it possible for the user to be less precise at the beginning of their gesture. For example, returning to the Figure 9 example, notice that box 906 is defined. When the user's finger is within box 906, or alternatively within the boundary of circle 900, the corresponding gesture can be in an "undecided" state. Once the finger crosses outside of the box (or circle) boundary, a decision as to the gesture can be made. In practice, this can be handled in a couple of different ways. First, neither a panning operation nor cross-slide functionality may be implemented until the finger has crossed the boundary of box 906. Alternatively, both panning and cross-slide operations can be implemented simultaneously while the finger is within the boundary of box 906. As soon as the finger crosses the box's boundary, the operation associated with that particular region can be retained, while the other operation can be cancelled.
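A hedged sketch of the second approach follows, in which both interpretations run speculatively until the finger leaves box 906; the Operation interface and all names here are assumptions made for illustration.

```typescript
// Both candidate operations run while the gesture is undecided; leaving
// the box keeps the operation for the entered region and cancels the other.
interface Operation {
  commit(): void;
  cancel(): void;
}

function onPointerMove(
  dx: number,
  dy: number,
  boxHalfSize: number,
  pan: Operation,
  crossSlide: Operation,
  classify: (dx: number, dy: number) => "pan" | "cross-slide",
): "undecided" | "decided" {
  if (Math.abs(dx) <= boxHalfSize && Math.abs(dy) <= boxHalfSize) {
    return "undecided"; // still inside box 906; keep both operations alive
  }
  if (classify(dx, dy) === "pan") {
    pan.commit();
    crossSlide.cancel();
  } else {
    crossSlide.commit();
    pan.cancel();
  }
  return "decided";
}

// Example usage with no-op operations:
const noop: Operation = { commit: () => {}, cancel: () => {} };
console.log(onPointerMove(2, 3, 10, noop, noop, () => "pan")); // "undecided"
```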
[00068] Once a cross-slide gesture has been detected, different thresholds can be utilized to implement the object-related actions. As an example, consider Figure 11. There, an object or item 1100 is shown. Various distances are shown and are indicated at 1102 and 1104. The distances show the displacement distance of the object 1100. In one or more embodiments, the first distance 1102 is a threshold which, when passed, results in an action potentially being committed. In this particular example, passing this distance threshold while performing a drag operation enables the object 1100 to be selected. To commit this action, the user would lift their finger, and the dragged object would slide back to its original position and change its state to be selected. The area past the threshold corresponding to distance 1102, and before the distance threshold 1104 is reached, can be treated as a buffer. Thus, releasing the object within this area would still result in object selection.
[00069] Once the dragged object (whether dragged along the solid line or in any other suitable direction, such as along the dashed line) reaches distance 1104 and crosses its threshold, the next object-related action on the cross-slide gesture can be committed. In this particular example, the object-related action can break the object free of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction. In one or more embodiments, if the object reaches line 1106, this can trigger yet additional object-related actions. For example, crossing this line with the object could trigger additional visual feedback to make it clear to the user that the drag-and-drop threshold has been reached.
[00070] It is to be appreciated and understood that any suitable number of distance thresholds can be employed and associated with object-related actions. For example, a first threshold could be defined by the illustrated circle's boundary within the object 1100, a second threshold by distance 1102, and a third threshold by distance 1104. In one or more embodiments, movement outside of the first threshold can lock the associated object in the associated direction of movement.
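The threshold ladder of Figure 11 can be modeled as an ordered list of distances, each unlocking a further action, with release committing the action of the last threshold crossed. The pixel values below are invented for the example; the patent does not recite specific distances.

```typescript
type ThresholdAction = "none" | "select" | "drag-and-drop" | "extra-feedback";

// Invented distances standing in for 1102, 1104, and line 1106.
const thresholds: Array<{ distance: number; action: ThresholdAction }> = [
  { distance: 30, action: "select" },
  { distance: 90, action: "drag-and-drop" },
  { distance: 150, action: "extra-feedback" },
];

function lastTriggeredAction(displacement: number): ThresholdAction {
  let action: ThresholdAction = "none";
  for (const t of thresholds) {
    if (displacement >= t.distance) action = t.action; // keep the furthest crossed
  }
  return action;
}

// Releasing the object commits the action of the last crossed threshold:
console.log(lastTriggeredAction(45));  // "select" (in the buffer past 1102)
console.log(lastTriggeredAction(120)); // "drag-and-drop"
```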
[00071] Figure 12 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00072] Step 1200 defines one or more thresholds. This step can be performed in any suitable way using any suitable type of threshold. For example, in the embodiment described above, distance thresholds were employed. It is to be appreciated and understood, however, that other types of thresholds and/or combinations thereof can be utilized without departing from the spirit and scope of the claimed subject matter.
[00073] In the illustrated and described embodiment, the defined thresholds can be utilized in connection with a cross-slide gesture as described above. Step 1202 detects a cross-slide gesture. Examples of how this can be done are provided above. Step 1204 detects one or more threshold triggers. For example, once a user has touch-engaged an object, they can move the object to a particular location. This step detects when the object has been moved sufficiently to trigger one or more thresholds. In embodiments where the thresholds are defined in terms of distances, this step can be performed by detecting when an object has moved a particular distance.
[00074] Step 1206 ascertains whether a user action indicates that an object-related action is to be committed. This step can be performed in any suitable way. For example, the user action could include lifting their finger off of a particular object. If a user action does not indicate that an object-related action is to be committed, step 1208 does not commit the object-related action. For example, the user could terminate the cross-slide gesture in a manner such that no action is to be taken. For instance, the cross-slide gesture can be reversible, i.e., if the user starts to drag an object downward, they can, at any time while still holding the object, slide it back to its original position. By doing so, no cross-slide action will be taken. Alternatively, one or more thresholds could be crossed without a user yet having indicated that the object-related action is to be committed. In this case, if the cross-slide gesture is still occurring, the method would continue to monitor for threshold triggers by returning to step 1204. If, on the other hand, a user action indicates that an object-related action is to be committed, step 1210 commits the object-related action associated with the last-triggered threshold. This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
[00075] In one or more embodiments, the multiple different directions utilized for cross-slide functionality can result either in the same object-related actions being performed, or in different object-related actions being performed. For example, object selection could occur when an object is dragged downward, while a drag-and-drop action could be performed when the object is dragged upward.
[00076] Having considered the use of multiple drag thresholds and associated object-related actions, consider now an additional example that employs thresholds along with indicia to provide direct-manipulation feedback of an object.
Direct Manipulation to Facilitate Threshold Discernability
[00077] In at least some embodiments, direct manipulation can provide visual feedback so that a user can visually observe an object move and, in accordance with the object's movement, can be provided with visual cues to facilitate threshold discernability. Any suitable type of visual cue can be employed, including, by way of example and not limitation, tool tips, icons, glyphs, and the like. In the example described just below, so-called speed bumps can be used to provide a user with an understanding or awareness of the various thresholds that may be present. As an example, consider Figure 13.
[00078] There, an object or item 1300 is shown. Various distances are shown and are indicated at 1302, 1304, and 1306. The distances show the displacement distance of the object 1300, or the distances through which the object can be displaced. In one or more embodiments, the first distance 1302 is a threshold which, when passed, results in an action potentially being committed. In this particular example, passing this distance threshold while performing a drag operation enables the object 1300 to be selected. To commit this action, the user would lift their finger, and the dragged object would slide back to its original position and change its state to be selected. The area past the threshold corresponding to distance 1302, and before the region corresponding to distance 1306 is reached, can be treated as a buffer. Thus, releasing the object within this area would still result in the object being selected.
[00079] Distance 1306 corresponds to a speed bump region. Movement of the object 1300 within the speed bump region is slower than the finger's movement. This presents a visual cue or indication that a new threshold is about to be reached, thus making it easier for the user to commit a particular action without accidentally moving into and past the next distance threshold. For example, within a speed bump region, a user might drag their finger 50 pixels in length, while the corresponding object moves five pixels in distance. Releasing the object within this speed bump region will result in an associated action being committed. In this example, the associated action is object selection.
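The reduced tracking ratio inside a speed bump can be modeled as a piecewise mapping from finger travel to object travel. The sketch below assumes a single bump region and a 0.1 ratio chosen to match the 50-pixels-to-5-pixels example in the text; the region bounds are invented.

```typescript
// Map finger travel to object travel, damping movement inside the bump.
function objectDisplacement(
  fingerDistance: number,
  bumpStart: number,
  bumpEnd: number,
  ratio = 0.1,
): number {
  if (fingerDistance <= bumpStart) return fingerDistance; // 1:1 before the bump
  const inBump = Math.min(fingerDistance, bumpEnd) - bumpStart;
  const pastBump = Math.max(0, fingerDistance - bumpEnd);
  return bumpStart + inBump * ratio + pastBump; // 1:1 again past the bump
}

console.log(objectDisplacement(30, 40, 90));  // 30 (before the bump)
console.log(objectDisplacement(90, 40, 90));  // 45 (50 px of finger -> 5 px of object)
console.log(objectDisplacement(100, 40, 90)); // 55 (resumes 1:1 past the bump)
```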
[00080] Once the dragged object proceeds through the speed bump region corresponding to distance 1306, and reaches distance 1304 and crosses its threshold, the next object-related action on the cross-slide gesture can be committed. In this particular example, the object-related action can break the object free of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction. In one or more embodiments, if the object reaches line 1308, this can trigger yet additional object-related actions. For example, crossing this line with the object could trigger additional visual feedback to make it clear to the user that the drag-and-drop threshold has been reached. Further, multiple speed bumps can be used in connection with the distance thresholds.
[00081] It is to be appreciated and understood that any suitable number of distance thresholds and speed bumps can be employed and associated with object-related actions. Alternatively or additionally, other visual indicia can be used to indicate thresholds or threshold changes. For example, while an object is being dragged, one or more lines can be rendered to indicate the thresholds and, thus, the distances an object is to be dragged to perform different actions. Visual cues can also be rendered over the object itself as it approaches or crosses a threshold.
[00082] Figure 14 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00083] Step 1400 defines one or more distance thresholds that include one or more speed bumps. This step can be performed in any suitable way. In the illustrated and described embodiment, the defined thresholds and speed bumps can be utilized in connection with a cross-slide gesture as described above. Step 1402 detects a cross-slide gesture. Examples of how this can be done are provided above. Step 1404 detects a speed bump crossing. For example, once a user has touch-engaged an object, the user can move the object in a particular direction. This step detects when the object has been moved sufficiently to cross a threshold associated with a speed bump.
[00084] Step 1406 modifies the user experience within the speed bump region. Any suitable modification of the user experience can be provided. For example, in at least some embodiments, modifying the user experience can entail modifying the user's visual experience. For example, and as noted above, the user's finger may move faster than the underlying object. Alternatively or additionally, other experience modifications can occur, including, by way of example and not limitation, providing audible or haptic feedback to indicate presence within a particular speed bump region.
[00085] Step 1408 ascertains whether a user action indicates that an object-related action is to be committed. This step can be performed in any suitable way. For example, the user action could include lifting their finger off of a particular object. If a user action does not indicate that an object-related action is to be committed, step 1410 does not commit the object-related action. For example, the user could terminate the cross-slide gesture in a manner such that no action is to be taken. For instance, the cross-slide gesture can be reversible, i.e., if the user starts to drag an object downward, they can, at any time while still holding the object, slide it back to its original position. By doing so, no cross-slide action will be taken. Alternatively, one or more thresholds and one or more speed bump regions could be crossed without a user yet having indicated that the object-related action is to be committed. In this case, if the cross-slide gesture is still occurring, the method would continue to monitor for threshold and speed bump crossings, as appropriate. If, on the other hand, a user action indicates that an object-related action is to be committed, step 1412 commits the object-related action associated with the last-crossed threshold. This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
[00086] In one or more embodiments, the multiple different directions utilized for cross-slide functionality can result either in the same object-related actions being performed, or in different object-related actions being performed. For example, object selection could occur when an object is dragged downward, while a drag-and-drop action could be performed when the object is dragged upward.
Interaction Feedback
[00087] In one or more embodiments, visual feedback can be provided to a user to inform the user of a particular object-related action that will be committed responsive to the detected cross-slide gesture. For example, as a particular object passes different distance thresholds, visual indicia can be provided to inform the user of the particular action that will be taken if the object is released. Alternatively or additionally, visual indicia can further be provided on an object-related action that may be forthcoming if dragging of the object continues.
[00088] As an example, consider Figure 15. There, an object in the form of a picture is shown generally at 1500. In the topmost part of the figure, a user has touched down on the object to initiate a drag operation. As the user drags the object downward, as shown at 1502, visual indicia 1504 can be shown as beginning to emerge from underneath the picture. In this particular example, the visual indicia resides in the form of a check box that gradually emerges from underneath the object. Any suitable type of visual indicia can be utilized without departing from the spirit and scope of the claimed subject matter. For example, the visual indicia could be presented in the form of a line, underneath the picture, past which the picture is to be dragged to perform a particular action. Once the object has been dragged a particular distance, as shown at 1506, the visual indicia, here the check box, can be fully exposed, thus informing the user that they can release the object to commit the object-related action. In this particular example, the object-related action comprises object selection. Thus, the fully exposed check box can indicate that the action is completed.
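A minimal sketch of how this gradual reveal can track the drag follows, assuming a linear mapping between drag distance and the exposed fraction of the check box; the threshold value and names are assumptions for the example.

```typescript
// Fraction of the indicium (e.g., check box 1504) to expose for a given
// drag distance; 1 means fully exposed, so release commits selection.
function indiciumRevealFraction(dragDistance: number, selectThreshold: number): number {
  return Math.max(0, Math.min(1, dragDistance / selectThreshold));
}

console.log(indiciumRevealFraction(15, 60)); // 0.25 (check box partially visible)
console.log(indiciumRevealFraction(60, 60)); // 1 (release now selects the object)
```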
[00089] Figure 16 is a flowchart that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably configured gesture module, such as the one described above.
[00090] Step 1600 detects a drag operation associated with an object. Examples of how this can be done are provided above. For example, the drag operation can be detected in conjunction with a cross-slide gesture. Step 1602 presents a partial portion of visual indicia associated with completion of an object-related action. Examples of how this can be done are provided above. Step 1604 ascertains whether the drag operation has continued. If the drag operation has not continued, the method can, in the event that a user has not terminated the drag operation, return to step 1602. In the event that the user has terminated the drag operation, such as by returning the object to its original position, the method can terminate. On the other hand, if the drag operation has continued, step 1606 ascertains whether a distance threshold associated with the object-related action has been reached. If not, the method can return to step 1602. By doing so, more of the visual indicia can be exposed in accordance with how far the object has been dragged. If, on the other hand, a distance threshold associated with the object-related action has been reached, step 1608 presents the complete visual indicia associated with committing the object-related action. By doing so, the visual indicia can visually inform the user that the object-related action can be committed, such as by the user lifting their finger off of the object.
[00091] Having described an exemplary visual indicia associated with committing an object-related action associated with a cross-slide gesture, consider now an exemplary device that can be used to implement one or more of the embodiments described above.
Exemplary Device
[00092] Figure 17 illustrates various components of an exemplary device 1700 that can be implemented as any type of portable and/or computing device as described with reference to Figures 1 and 2 to implement the embodiments of the gesture techniques described herein. The device 1700 includes communication devices 1702 that enable wired and/or wireless communication of device data 1704 (e.g., received data, data that is being received, data scheduled for transmission, data packets of the data, etc.). The device data 1704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on the device 1700 can include any type of audio, video, and/or image data. The device 1700 includes one or more data inputs 1706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
[00093] The device 1700 also includes communication interfaces 1708 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 1708 provide a connection and/or communication links between the device 1700 and a communication network by which other electronic, computing, and communication devices communicate data with the device 1700.
[00094] The device 1700 includes one or more processors 1710 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of the device 1700 and to implement the gesture embodiments described above. Alternatively or additionally, the device 1700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 1712. Although not shown, the device 1700 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
[00095] The device 1700 also includes computer-readable media 1714, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device can be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewritable compact disc (CD), any type of digital versatile disc (DVD), and the like. The device 1700 can also include a mass storage media device 1716.
[00096] Computer-readable media 1714 provides data storage mechanisms to store the device data 1704, as well as various device applications 1718 and any other types of information and/or data related to operational aspects of the device 1700. For example, an operating system 1720 can be maintained as a computer application with the computer-readable media 1714 and executed on the processors 1710. The device applications 1718 can include a device manager (e.g., a control application, a software application, a signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.), as well as other applications that can include web browsers, image processing applications, communication applications such as instant messaging applications, word processing applications, and a variety of other different applications. The device applications 1718 also include any system components or modules to implement the embodiments of the gesture techniques described herein. In this example, the device applications 1718 include an interface application 1722 and a gesture capture unit 1724 that are shown as software modules and/or computer applications. The gesture capture unit 1724 is representative of software that is used to provide an interface with a device configured to capture a gesture, such as a touch screen, trackpad, camera, and so on. Alternatively or additionally, the interface application 1722 and the gesture capture unit 1724 can be implemented as hardware, software, firmware, or any combination thereof.
[00097] The device 1700 also includes an audio and/or video input-output system 1726 that provides audio data to an audio system 1728 and/or provides video data to a display system 1730. The audio system 1728 and/or the display system 1730 can include any devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from the device 1700 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In one embodiment, the audio system 1728 and/or the display system 1730 are implemented as external components to the device 1700. Alternatively, the audio system 1728 and/or the display system 1730 are implemented as integrated components of the exemplary device 1700.
[00098] Cross-sliding gestures for touch screens are described. In at least some embodiments, cross-swipe gestures can be used over content that pans or scrolls in one direction, to allow for additional actions, such as content selection, drag-and-drop operations, and the like.
[00099] In one or more modalities, a transverse swipe gesture can be performed by dragging an item or object in a direction that is different from a scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more modalities, one or more limits can be used, such as a distance limit, in combination with the different-direction drag, to map to additional actions or functionality.
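As a sketch only, and not language from the patent, discriminating a cross-slide from a pan can amount to comparing the drag's dominant axis against the single pan axis; the TypeScript below assumes a hypothetical isCrossSlide helper:

type Axis = "horizontal" | "vertical";

// A drag is a cross-slide when its dominant axis differs from the pan axis.
function isCrossSlide(dx: number, dy: number, panAxis: Axis): boolean {
  return panAxis === "vertical"
    ? Math.abs(dx) > Math.abs(dy)
    : Math.abs(dy) > Math.abs(dx);
}

A fuller recognizer might additionally apply an angular tolerance around the orthogonal direction, consistent with the "generally orthogonal" language of claim 4 below.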
[000100] In at least some modes, so-called speed reducers can be used to provide a user with an understanding or awareness of limits.
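One way to realize such a speed reducer, offered here only as an assumed sketch (the damping factor and limit are invented values), is to let the dragged object track the finger one-to-one up to a distance limit and at a reduced rate beyond it, so the user perceives that a limit is being crossed:

// Map the finger's displacement along the cross-slide axis to the
// object's displacement; beyond reducerStart the object moves slower
// than the finger.
function objectOffset(fingerOffset: number, reducerStart: number, damping = 0.4): number {
  const d = Math.abs(fingerOffset);
  if (d <= reducerStart) {
    return fingerOffset; // 1:1 tracking before the limit
  }
  const reduced = reducerStart + (d - reducerStart) * damping;
  return Math.sign(fingerOffset) * reduced;
}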
[000101] Although the modalities have been described in language specific to structural features and/or methodological acts, it should be understood that the modalities defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are described as exemplary forms of implementing the claimed modalities.
Claims (5)
[0001]
1. Method characterized in that it comprises the steps of: detecting (500) a swipe gesture input relative to a display device (108, 308, 408, 608, 708) of a computing device (102, 302, 402, 602, 702), where the swipe gesture input is applied by a finger of a user and starts at a displayed object (1300) touched by the finger, the display device (108, 308, 408, 608, 708) displaying content that is pannable in a single pan direction; determining (502) whether the direction of the swipe gesture input is different from the pan direction; and responsive to the swipe gesture input being in a direction that is different from the pan direction, performing a drag operation on the object (1300) and, based on a displacement distance of the object (1300) during the drag operation, performing one of the following operations: if the displacement distance exceeds a first threshold distance (1302) and is less than a second threshold distance and the finger is then lifted from the display device (108, 308, 408, 608, 708), selecting the object (1300) and sliding the object (1300) back to its original position; if the displacement distance exceeds the second threshold distance and is less than a third threshold distance (1304), moving the object (1300) more slowly than the finger moves on the display device (108, 308, 408, 608, 708), and selecting the object (1300) if the finger is then lifted from the display device (108, 308, 408, 608, 708); and if the displacement distance exceeds the third threshold distance (1304), enabling the user to drag and drop the object (1300) in any direction.
[0002]
2. Method according to claim 1, characterized in that the pan direction is vertical along the display device (108, 308, 408, 608, 708).
[0003]
3. Method according to claim 1, characterized in that the pan direction is horizontal along the display device (108, 308, 408, 608, 708).
[0004]
4. Method according to claim 1, characterized in that the direction that is different from the pan direction comprises a direction that is generally orthogonal to the pan direction.
[0005]
5. Computer-readable storage memory characterized by the fact that it comprises instructions which, when executed, perform the method as defined in any one of claims 1 to 4.
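By way of illustration only, and not as part of the claims, the three threshold distances of claim 1 can be read as partitioning the displacement distance into four outcome regions; the following TypeScript sketch uses invented pixel values and names:

type Outcome = "none" | "select-and-snap-back" | "select-with-speed-reducer" | "drag-and-drop";

// Illustrative limits only; the patent does not fix concrete values.
const FIRST = 20;   // first threshold distance (1302)
const SECOND = 60;  // second threshold distance
const THIRD = 120;  // third threshold distance (1304)

function outcomeForDistance(distance: number): Outcome {
  if (distance > THIRD) return "drag-and-drop";              // drag the object in any direction
  if (distance > SECOND) return "select-with-speed-reducer"; // object trails the finger; select on lift
  if (distance > FIRST) return "select-and-snap-back";       // select, slide back to original position
  return "none";
}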
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

US4823283A|1986-10-14|1989-04-18|Tektronix, Inc.|Status driven menu system|
US5189732A|1987-11-18|1993-02-23|Hitachi, Ltd.|Touch panel input apparatus|
JPH01147647A|1987-12-03|1989-06-09|Mitsubishi Electric Corp|Data processor|
US5046001A|1988-06-30|1991-09-03|Ibm Corporation|Method for accessing selected windows in a multi-tasking system|
US5321750A|1989-02-07|1994-06-14|Market Data Corporation|Restricted information distribution system apparatus and methods|
US5339392A|1989-07-27|1994-08-16|Risberg Jeffrey S|Apparatus and method for creation of a user definable video displayed document showing changes in real time data|
US5526034A|1990-09-28|1996-06-11|Ictv, Inc.|Interactive home information system with signal assignment|
US5297032A|1991-02-01|1994-03-22|Merrill Lynch, Pierce, Fenner & Smith Incorporated|Securities trading workstation|
FR2693810B1|1991-06-03|1997-01-10|Apple Computer|USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA.|
US5258748A|1991-08-28|1993-11-02|Hewlett-Packard Company|Accessing and selecting multiple key functions with minimum keystrokes|
JP3341290B2|1991-09-10|2002-11-05|ソニー株式会社|Video display device|
JP2654283B2|1991-09-30|1997-09-17|株式会社東芝|Icon display method|
JP2827612B2|1991-10-07|1998-11-25|富士通株式会社|A touch panel device and a method for displaying an object on the touch panel device.|
US6061062A|1991-12-20|2000-05-09|Apple Computer, Inc.|Zooming controller|
US5640176A|1992-01-24|1997-06-17|Compaq Computer Corporation|User interface for easily setting computer speaker volume and power conservation levels|
JPH07306955A|1992-07-24|1995-11-21|Walt Disney Co:The|Method and system for generation of three-dimensional illusion|
US5508717A|1992-07-28|1996-04-16|Sony Corporation|Computer pointing device with dynamic sensitivity|
US5432932A|1992-10-23|1995-07-11|International Business Machines Corporation|System and method for dynamically controlling remote processes from a performance monitor|
US5463725A|1992-12-31|1995-10-31|International Business Machines Corp.|Data processing system graphical user interface which emulates printed material|
DE69432199T2|1993-05-24|2004-01-08|Sun Microsystems, Inc., Mountain View|Graphical user interface with methods for interfacing with remote control devices|
US5598523A|1994-03-31|1997-01-28|Panasonic Technologies, Inc.|Method and system for displayed menu activation using a matching distinctive arrangement of keypad actuators|
US5914720A|1994-04-21|1999-06-22|Sandia Corporation|Method of using multiple perceptual channels to increase user absorption of an N-dimensional presentation environment|
US5495566A|1994-11-22|1996-02-27|Microsoft Corporation|Scrolling contents of a window|
US5623613A|1994-11-29|1997-04-22|Microsoft Corporation|System for displaying programming information|
US5611060A|1995-02-22|1997-03-11|Microsoft Corporation|Auto-scrolling during a drag and drop operation|
US5819284A|1995-03-24|1998-10-06|At&T Corp.|Personalized real time information display as a portion of a screen saver|
US5793415A|1995-05-15|1998-08-11|Imagetel International Inc.|Videoconferencing and multimedia system|
US6807558B1|1995-06-12|2004-10-19|Pointcast, Inc.|Utilization of information “push” technology|
US5860073A|1995-07-17|1999-01-12|Microsoft Corporation|Style sheets for publishing system|
US5687331A|1995-08-03|1997-11-11|Microsoft Corporation|Method and system for displaying an animated focus item|
US5712995A|1995-09-20|1998-01-27|Galileo Frames, Inc.|Non-overlapping tiling apparatus and method for multiple window displays|
US5574836A|1996-01-22|1996-11-12|Broemmelsiek; Raymond M.|Interactive display apparatus and method with viewer position compensation|
US6008816A|1996-04-25|1999-12-28|Microsoft Corporation|Method and system for managing color specification using attachable palettes and palettes that refer to other palettes|
US5675329A|1996-05-09|1997-10-07|International Business Machines Corporation|Method of obtaining a second function from keys on a keyboard using pressure differentiation|
US5771042A|1996-07-17|1998-06-23|International Business Machines Corporation|Multi-size control for multiple adjacent workspaces|
US5963204A|1996-09-20|1999-10-05|Nikon Corporation|Electronic camera with reproduction and display of images at the same timing|
US6064383A|1996-10-04|2000-05-16|Microsoft Corporation|Method and system for selecting an emotional appearance and prosody for a graphical character|
US6057839A|1996-11-26|2000-05-02|International Business Machines Corporation|Visualization tool for graphically displaying trace data produced by a parallel processing computer|
US6216141B1|1996-12-06|2001-04-10|Microsoft Corporation|System and method for integrating a document into a desktop window on a client computer|
US5905492A|1996-12-06|1999-05-18|Microsoft Corporation|Dynamically updating themes for an operating system shell|
US5959621A|1996-12-06|1999-09-28|Microsoft Corporation|System and method for displaying data items in a ticker display pane on a client computer|
US6211921B1|1996-12-20|2001-04-03|Philips Electronics North America Corporation|User interface for television|
US6009519A|1997-04-04|1999-12-28|Andrea Electronics, Corp.|Method and apparatus for providing audio utility software for use in windows applications|
US6028600A|1997-06-02|2000-02-22|Sony Corporation|Rotary menu wheel interface|
WO1999010799A1|1997-08-22|1999-03-04|Natrificial Llc|Method and apparatus for simultaneously resizing and relocating windows within a graphical display|
KR100300972B1|1997-09-19|2001-09-03|윤종용|Texture mapping system and texture cache access method|
US6008809A|1997-09-22|1999-12-28|International Business Machines Corporation|Apparatus and method for viewing multiple windows within a dynamic window|
US9197599B1|1997-09-26|2015-11-24|Verizon Patent And Licensing Inc.|Integrated business system for web based telecommunications management|
US6266098B1|1997-10-22|2001-07-24|Matsushita Electric Corporation Of America|Function presentation and selection using a rotatable function menu|
US5940076A|1997-12-01|1999-08-17|Motorola, Inc.|Graphical user interface for an electronic device and method therefor|
US6449638B1|1998-01-07|2002-09-10|Microsoft Corporation|Channel definition architecture extension|
US9292111B2|1998-01-26|2016-03-22|Apple Inc.|Gesturing with a multipoint sensing device|
US6011542A|1998-02-13|2000-01-04|Sony Corporation|Graphical text entry wheel|
US6278448B1|1998-02-17|2001-08-21|Microsoft Corporation|Composite Web page built from any web content|
WO1999046711A1|1998-03-13|1999-09-16|Aspen Technology, Inc.|Computer method and apparatus for automatic execution of software applications|
US6108003A|1998-03-18|2000-08-22|International Business Machines Corporation|Maintaining visibility and status indication of docked applications and application bars|
FR2776415A1|1998-03-20|1999-09-24|Philips Consumer Communication|ELECTRONIC APPARATUS HAVING A SCREEN AND METHOD FOR DISPLAYING GRAPHICS|
US6784925B1|1998-03-24|2004-08-31|Canon Kabushiki Kaisha|System to manage digital camera images|
US6448987B1|1998-04-03|2002-09-10|Intertainer, Inc.|Graphic user interface for a digital content delivery system using circular menus|
US6104418A|1998-04-06|2000-08-15|Silicon Magic Corporation|Method and system for improved memory interface during image rendering|
JPH11298572A|1998-04-07|1999-10-29|Nec Shizuoka Ltd|Receiver and method for displaying received information|
US6311058B1|1998-06-30|2001-10-30|Microsoft Corporation|System for delivering data content over a low bit rate transmission channel|
US6212564B1|1998-07-01|2001-04-03|International Business Machines Corporation|Distributed application launcher for optimizing desktops based on client characteristics information|
US6611272B1|1998-07-02|2003-08-26|Microsoft Corporation|Method and apparatus for rasterizing in a hierarchical tile order|
AR020608A1|1998-07-17|2002-05-22|United Video Properties Inc|A METHOD AND A PROVISION TO SUPPLY A USER REMOTE ACCESS TO AN INTERACTIVE PROGRAMMING GUIDE BY A REMOTE ACCESS LINK|
US6369837B1|1998-07-17|2002-04-09|International Business Machines Corporation|GUI selector control|
US6832355B1|1998-07-28|2004-12-14|Microsoft Corporation|Web page display system|
US6188405B1|1998-09-14|2001-02-13|Microsoft Corporation|Methods, apparatus and data structures for providing a user interface, which exploits spatial memory, to objects|
US20020018051A1|1998-09-15|2002-02-14|Mona Singh|Apparatus and method for moving objects on a touchscreen display|
US6865297B2|2003-04-15|2005-03-08|Eastman Kodak Company|Method for automatically classifying images into events in a multimedia authoring application|
US6510553B1|1998-10-26|2003-01-21|Intel Corporation|Method of streaming video from multiple sources over a network|
JP3956553B2|1998-11-04|2007-08-08|富士ゼロックス株式会社|Icon display processing device|
US6597374B1|1998-11-12|2003-07-22|Microsoft Corporation|Activity based remote control unit|
US6337698B1|1998-11-20|2002-01-08|Microsoft Corporation|Pen-based interface for a notepad computer|
US6510466B1|1998-12-14|2003-01-21|International Business Machines Corporation|Methods, systems and computer program products for centralized management of application programs on a network|
US6577350B1|1998-12-21|2003-06-10|Sony Corporation|Method and apparatus for displaying an electronic program guide|
US6396963B2|1998-12-29|2002-05-28|Eastman Kodak Company|Photocollage generation and modification|
US6628309B1|1999-02-05|2003-09-30|International Business Machines Corporation|Workspace drag and drop|
US7283620B2|1999-02-26|2007-10-16|At&T Bls Intellectual Property, Inc.|Systems and methods for originating and sending a voice mail message to an instant messaging platform|
US6463304B2|1999-03-04|2002-10-08|Openwave Systems Inc.|Application launcher for a two-way mobile communications device|
US6281940B1|1999-03-31|2001-08-28|Sony Corporation|Display of previewed channels with rotation of multiple previewed channels along an arc|
US6710771B1|1999-05-13|2004-03-23|Sony Corporation|Information processing method and apparatus and medium|
US6505243B1|1999-06-02|2003-01-07|Intel Corporation|Automatic web-based detection and display of product installation help information|
US6456334B1|1999-06-29|2002-09-24|Ati International Srl|Method and apparatus for displaying video in a data processing system|
US6426753B1|1999-07-01|2002-07-30|Microsoft Corporation|Cache memory for high latency and out-of-order return of texture data|
US6577323B1|1999-07-01|2003-06-10|Honeywell Inc.|Multivariable process trend display and methods regarding same|
US6971067B1|1999-08-23|2005-11-29|Sentillion, Inc.|Application launchpad|
US6976210B1|1999-08-31|2005-12-13|Lucent Technologies Inc.|Method and apparatus for web-site-independent personalization from multiple sites having user-determined extraction functionality|
US6424338B1|1999-09-30|2002-07-23|Gateway, Inc.|Speed zone touchpad|
DE60035324T2|1999-10-26|2008-02-28|Iontas Ltd., Moville|Monitoring of computer usage|
US7987431B2|1999-10-29|2011-07-26|Surfcast, Inc.|System and method for simultaneous display of multiple information sources|
US6724403B1|1999-10-29|2004-04-20|Surfcast, Inc.|System and method for simultaneous display of multiple information sources|
US7028264B2|1999-10-29|2006-04-11|Surfcast, Inc.|System and method for simultaneous display of multiple information sources|
US6697825B1|1999-11-05|2004-02-24|Decentrix Inc.|Method and apparatus for generating and modifying multiple instances of element of a web site|
US6510144B1|1999-12-07|2003-01-21|Cisco Technology, Inc.|Network layer support to enhance the transport layer performance in mobile and wireless environments|
US6820111B1|1999-12-07|2004-11-16|Microsoft Corporation|Computer user interface architecture that saves a user's non-linear navigation history and intelligently maintains that history|
US6801203B1|1999-12-22|2004-10-05|Microsoft Corporation|Efficient graphics pipeline with a pixel cache and data pre-fetching|
JP3720230B2|2000-02-18|2005-11-24|シャープ株式会社|Expression data control system, expression data control apparatus constituting the same, and recording medium on which the program is recorded|
US6433789B1|2000-02-18|2002-08-13|Neomagic Corp.|Steaming prefetching texture cache for level of detail maps in a 3D-graphics engine|
KR100460105B1|2000-02-22|2004-12-03|엘지전자 주식회사|Method for searching a menu in a mobile communication terminal|
US20030046396A1|2000-03-03|2003-03-06|Richter Roger K.|Systems and methods for managing resource utilization in information management environments|
US20020152305A1|2000-03-03|2002-10-17|Jackson Gregory J.|Systems and methods for resource utilization analysis in information management environments|
US6721958B1|2000-03-08|2004-04-13|Opentv, Inc.|Optional verification of interactive television content|
US8701027B2|2000-03-16|2014-04-15|Microsoft Corporation|Scope user interface for displaying the priorities and properties of multiple informational items|
US6507643B1|2000-03-16|2003-01-14|Breveon Incorporated|Speech recognition system and method for converting voice mail messages to electronic mail messages|
US6636246B1|2000-03-17|2003-10-21|Vizible.Com Inc.|Three dimensional spatial user interface|
GB2360658B|2000-03-20|2004-09-08|Hewlett Packard Co|Camera with user identity data|
US7155729B1|2000-03-28|2006-12-26|Microsoft Corporation|Method and system for displaying transient notifications|
US7249326B2|2000-04-06|2007-07-24|Microsoft Corporation|Method and system for reducing notification area clutter|
KR100363619B1|2000-04-21|2002-12-05|배동훈|Contents structure with a spiral donut and contents display system|
JP4325075B2|2000-04-21|2009-09-02|ソニー株式会社|Data object management device|
JP4730571B2|2000-05-01|2011-07-20|ソニー株式会社|Information processing apparatus and method, and program storage medium|
US20020133554A1|2000-05-25|2002-09-19|Daniel Checkoway|E-mail answering agent|
US7210099B2|2000-06-12|2007-04-24|Softview Llc|Resolution independent vector display of internet content|
JP2003536177A|2000-06-22|2003-12-02|インテルコーポレイション|Method and system for transferring objects between users or applications|
JP2002014661A|2000-06-29|2002-01-18|Toshiba Corp|Liquid crystal display device and electronic equipment provided therewith|
US6966034B2|2000-06-30|2005-11-15|Microsoft Corporation|Supplemental request header for applications or devices using web browsers|
US6662023B1|2000-07-06|2003-12-09|Nokia Mobile Phones Ltd.|Method and apparatus for controlling and securing mobile phones that are lost, stolen or misused|
US6907273B1|2000-07-07|2005-06-14|Openwave Systems Inc.|Method and system for processing overloaded keys of a mobile device|
GB0017793D0|2000-07-21|2000-09-06|Secr Defence|Human computer interface|
EP1184414A3|2000-08-30|2003-08-06|JSR Corporation|Conjugated diene-based rubber and method of producing the same, oil extended rubber and rubber composition containing the same|
US7043690B1|2000-09-11|2006-05-09|International Business Machines Corporation|Method, system, and program for checking contact information|
SE524595C2|2000-09-26|2004-08-31|Hapax Information Systems Ab|Procedure and computer program for normalization of style throws|
GB0027260D0|2000-11-08|2000-12-27|Koninl Philips Electronics Nv|An image control system|
US7263668B1|2000-11-09|2007-08-28|International Business Machines Corporation|Display interface to a computer controlled display system with variable comprehensiveness levels of menu items dependent upon size of variable display screen available for menu item display|
AU2017202A|2000-11-15|2002-05-27|David M Holbrook|Apparatus and method for organizing and/or presenting data|
US6907574B2|2000-11-29|2005-06-14|Ictv, Inc.|System and method of hyperlink navigation between frames|
US7058955B2|2000-12-06|2006-06-06|Microsoft Corporation|Method and system for passing messages between threads|
CA2328795A1|2000-12-19|2002-06-19|Advanced Numerical Methods Ltd.|Applications and performance enhancements for detail-in-context viewing technology|
US6983310B2|2000-12-29|2006-01-03|International Business Machines Corporation|System and method for providing search capabilties on a wireless device|
US7133859B1|2001-01-05|2006-11-07|Palm, Inc.|Category specific sort and display instructions for an electronic device|
US20020097264A1|2001-01-19|2002-07-25|Ibm Corporation|Apparatus and methods for management of temporal parameters to provide enhanced accessibility to computer programs|
US7069207B2|2001-01-26|2006-06-27|Microsoft Corporation|Linguistically intelligent text compression|
US6938101B2|2001-01-29|2005-08-30|Universal Electronics Inc.|Hand held device having a browser application|
SE519884C2|2001-02-02|2003-04-22|Scalado Ab|Method for zooming and producing a zoomable image|
US7735021B2|2001-02-16|2010-06-08|Microsoft Corporation|Shortcut system for use in a mobile electronic device and method thereof|
US6798421B2|2001-02-28|2004-09-28|3D Labs, Inc. Ltd.|Same tile method|
US20020129061A1|2001-03-07|2002-09-12|Swart Stacey J.|Method and apparatus for creating files that are suitable for hardcopy printing and for on-line use|
US7295836B2|2001-03-09|2007-11-13|Research In Motion Limited|Advanced voice and data operations in a mobile data communication device|
US7017119B1|2001-03-15|2006-03-21|Vaultus Mobile Technologies, Inc.|System and method for display notification in a tabbed window setting|
US6972776B2|2001-03-20|2005-12-06|Agilent Technologies, Inc.|Scrolling method using screen pointing device|
US6904597B2|2001-03-30|2005-06-07|Intel Corporation|Inter-thread communications between different components using double buffer|
US7734285B2|2001-04-03|2010-06-08|Qualcomm Incorporated|Method and apparatus for network initiated uninstallation of application program over wireless network|
US6778192B2|2001-04-05|2004-08-17|International Business Machines Corporation|System and method for creating markers on scroll bars of a graphical user interface|
US6990638B2|2001-04-19|2006-01-24|International Business Machines Corporation|System and method for using shading layers and highlighting to navigate a tree view display|
US20020161634A1|2001-04-27|2002-10-31|Koninklijke Philips Electronics N.V.|Electronic document with an automatically updated portion|
WO2002089108A1|2001-04-30|2002-11-07|Broadband Graphics, Llc|Cell based eui methods and apparatuses|
US6907447B1|2001-04-30|2005-06-14|Microsoft Corporation|Method and apparatus for providing an instant message notification|
US20020186251A1|2001-06-07|2002-12-12|International Business Machines Corporation|Method, apparatus and computer program product for context-sensitive scrolling|
PT1271896E|2001-06-18|2004-12-31|Swisscom Mobile Ag|METHOD AND SYSTEM FOR INTERNET PROTOCOL MECHANISMS IN HETEROGENETIC NETWORKS|
JP2003009244A|2001-06-25|2003-01-10|Fuji Photo Film Co Ltd|Image data transmitter and controlling method thereof|
US6975836B2|2001-06-28|2005-12-13|Kabushiki Kaisha Toshiba|Data broadcasting system, receiving terminal device, contents providing server, and contents providing method|
KR100420280B1|2001-07-09|2004-03-02|삼성전자주식회사|Menu display method of mobile terminal|
US6876312B2|2001-07-10|2005-04-05|Behavior Tech Computer Corporation|Keyboard with multi-function keys|
US6987991B2|2001-08-17|2006-01-17|Wildseed Ltd.|Emoticon input method and apparatus|
FR2828970B1|2001-08-27|2003-12-19|Cit Alcatel|INTEROPERABILITY SYSTEM BETWEEN MMS MESSAGES AND SMS / EMS MESSAGES AND RELATED EXCHANGE METHOD|
US6690365B2|2001-08-29|2004-02-10|Microsoft Corporation|Automatic scrolling|
US20030096604A1|2001-08-29|2003-05-22|Jorg Vollandt|Method of operating an electronic device, in particular a mobile telephone|
US7093201B2|2001-09-06|2006-08-15|Danger, Inc.|Loop menu navigation apparatus and method|
US6912695B2|2001-09-13|2005-06-28|Pixia Corp.|Data storage and retrieval system and method|
US7036091B1|2001-09-24|2006-04-25|Digeo, Inc.|Concentric curvilinear menus for a graphical user interface|
US20030073414A1|2001-10-15|2003-04-17|Stephen P. Capps|Textual and telephony dual input device|
US6857104B1|2001-10-17|2005-02-15|At&T Corp|Organizing graphical user interfaces to reveal hidden areas|
US7333092B2|2002-02-25|2008-02-19|Apple Computer, Inc.|Touch pad for handheld device|
US7487262B2|2001-11-16|2009-02-03|At & T Mobility Ii, Llc|Methods and systems for routing messages through a communications network based on message content|
JP2003162355A|2001-11-26|2003-06-06|Sony Corp|Display switching method of task, portable equipment, and portable communication equipment|
AU2002357029A1|2001-11-30|2003-06-17|A New Voice, Inc.|Method and system for contextual prioritization of unified messages|
US20030135582A1|2001-12-21|2003-07-17|Docomo Communications Laboratories Usa, Inc.|Context aware search service|
US6690387B2|2001-12-28|2004-02-10|Koninklijke Philips Electronics N.V.|Touch-screen image scrolling system and method|
US7139800B2|2002-01-16|2006-11-21|Xerox Corporation|User interface for a message-based system having embedded information management capabilities|
FI116425B|2002-01-18|2005-11-15|Nokia Corp|Method and apparatus for integrating an extensive keyboard into a small apparatus|
EP1469375B1|2002-01-22|2011-07-13|Fujitsu Limited|Menu element selecting device and method|
WO2003062975A1|2002-01-22|2003-07-31|Fujitsu Limited|Menu element selecting device and method|
US7146573B2|2002-01-28|2006-12-05|International Business Machines Corporation|Automatic window representation adjustment|
US7019757B2|2002-01-28|2006-03-28|International Business Machines Corporation|Changing the alpha levels of an application window to indicate a status of a computing task|
US20040078299A1|2002-01-31|2004-04-22|Kathleen Down-Logan|Portable color and style analysis, match and management system|
US7031977B2|2002-02-28|2006-04-18|Plumtree Software, Inc.|Efficiently storing indented threads in a threaded discussion application|
US6952207B1|2002-03-11|2005-10-04|Microsoft Corporation|Efficient scenery object rendering|
US7610563B2|2002-03-22|2009-10-27|Fuji Xerox Co., Ltd.|System and method for controlling the display of non-uniform graphical objects|
US7127685B2|2002-04-30|2006-10-24|America Online, Inc.|Instant messaging interface having a tear-off element|
US7689649B2|2002-05-31|2010-03-30|Aol Inc.|Rendering destination instant messaging personalization items before communicating with destination|
US7779076B2|2002-05-31|2010-08-17|Aol Inc.|Instant messaging personalization|
US20080048986A1|2002-06-10|2008-02-28|Khoo Soon H|Compound Computing Device with Dual Portion Keyboards Controlled by a Single Processing Element|
WO2004001578A1|2002-06-21|2003-12-31|Nokia Corporation|Mobile communication device having music player navigation function and method of operation thereof|
US6873329B2|2002-07-05|2005-03-29|Spatial Data Technologies, Inc.|System and method for caching and rendering images|
US7302648B1|2002-07-10|2007-11-27|Apple Inc.|Method and apparatus for resizing buffered windows|
US7658562B2|2002-07-12|2010-02-09|Dana Suess|Modified-QWERTY letter layout for rapid data entry|
WO2004008404A1|2002-07-12|2004-01-22|Dana Suess|Modified-qwerty letter layout for rapid data entry|
US7111044B2|2002-07-17|2006-09-19|Fastmobile, Inc.|Method and system for displaying group chat sessions on wireless mobile terminals|
US7089507B2|2002-08-12|2006-08-08|International Business Machines Corporation|System and method for display views using a single stroke control|
US6707890B1|2002-09-03|2004-03-16|Bell South Intellectual Property Corporation|Voice mail notification using instant messaging|
US7065385B2|2002-09-12|2006-06-20|Sony Ericsson Mobile Communications Ab|Apparatus, methods, and computer program products for dialing telephone numbers using alphabetic selections|
US20040068543A1|2002-10-03|2004-04-08|Ralph Seifert|Method and apparatus for processing e-mail|
US7913183B2|2002-10-08|2011-03-22|Microsoft Corporation|System and method for managing software applications in a graphical user interface|
JP2004133733A|2002-10-11|2004-04-30|Sony Corp|Display device, display method, and program|
KR200303655Y1|2002-11-19|2003-02-14|강성윤|Folder-type Mobile phone which is convenient for character message transmission|
CA2414378A1|2002-12-09|2004-06-09|Corel Corporation|System and method for controlling user interface features of a web application|
US7600234B2|2002-12-10|2009-10-06|Fisher-Rosemount Systems, Inc.|Method for launching applications|
AU2002953555A0|2002-12-23|2003-01-16|Canon Kabushiki Kaisha|Method for presenting hierarchical data|
US7321824B1|2002-12-30|2008-01-22|Aol Llc|Presenting a travel route using more than one presentation style|
JP2004227393A|2003-01-24|2004-08-12|Sony Corp|Icon drawing system, icon drawing method and electronic device|
US6885974B2|2003-01-31|2005-04-26|Microsoft Corporation|Dynamic power control apparatus, systems and methods|
US7158123B2|2003-01-31|2007-01-02|Xerox Corporation|Secondary touch contextual sub-menu navigation for touch screen interface|
US7606714B2|2003-02-11|2009-10-20|Microsoft Corporation|Natural language classification within an automated response system|
US20040185883A1|2003-03-04|2004-09-23|Jason Rukman|System and method for threading short message service messages with multimedia messaging service messages|
US7075535B2|2003-03-05|2006-07-11|Sand Codex|System and method for exact rendering in a zooming user interface|
US7313764B1|2003-03-06|2007-12-25|Apple Inc.|Method and apparatus to accelerate scrolling for buffered windows|
US7480872B1|2003-04-06|2009-01-20|Apple Inc.|Method and apparatus for dynamically resizing windows|
GB2421667A|2003-04-22|2006-06-28|Spinvox Ltd|Queuing and load balancing of voicemail for intelligent transcription into text message|
US7102626B2|2003-04-25|2006-09-05|Hewlett-Packard Development Company, L.P.|Multi-function pointing device|
US7388579B2|2003-05-01|2008-06-17|Motorola, Inc.|Reduced power consumption for a graphics accelerator and display|
US8555165B2|2003-05-08|2013-10-08|Hillcrest Laboratories, Inc.|Methods and systems for generating a zoomable graphical user interface|
US7173623B2|2003-05-09|2007-02-06|Microsoft Corporation|System supporting animation of graphical display elements through animation object instances|
JP4177713B2|2003-05-30|2008-11-05|京セラ株式会社|Imaging device|
JP2005004396A|2003-06-11|2005-01-06|Sony Corp|Information display method, information display unit, and computer program|
GB2404630B|2003-08-07|2006-09-27|Research In Motion Ltd|Cover plate for a mobile device having a push-through dial keypad|
US7669140B2|2003-08-21|2010-02-23|Microsoft Corporation|System and method for providing rich minimized applications|
US7308288B2|2003-08-22|2007-12-11|Sbc Knowledge Ventures, Lp.|System and method for prioritized interface design|
US7725419B2|2003-09-05|2010-05-25|Samsung Electronics Co., Ltd|Proactive user interface including emotional agent|
KR100566122B1|2003-09-15|2006-03-30| 멀티비아|Method of compressing still pictures for mobile devices|
US7433920B2|2003-10-10|2008-10-07|Microsoft Corporation|Contact sidebar tile|
US7231231B2|2003-10-14|2007-06-12|Nokia Corporation|Method and apparatus for locking a mobile telephone touch screen|
US7224963B2|2003-10-17|2007-05-29|Sony Ericsson Mobile Communications Ab|System method and computer program product for managing themes in a mobile phone|
US20050085215A1|2003-10-21|2005-04-21|Nokia Corporation|Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state|
US20050090239A1|2003-10-22|2005-04-28|Chang-Hung Lee|Text message based mobile phone configuration system|
US7644376B2|2003-10-23|2010-01-05|Microsoft Corporation|Flexible architecture for notifying applications of state changes|
US7461151B2|2003-11-13|2008-12-02|International Business Machines Corporation|System and method enabling future messaging directives based on past participation via a history monitor|
US7370284B2|2003-11-18|2008-05-06|Laszlo Systems, Inc.|User interface for displaying multiple applications|
US7814419B2|2003-11-26|2010-10-12|Nokia Corporation|Changing an orientation of a user interface via a course of motion|
US7454713B2|2003-12-01|2008-11-18|Sony Ericsson Mobile Communications Ab|Apparatus, methods and computer program products providing menu expansion and organization functions|
KR100871404B1|2003-12-01|2008-12-02|리서치 인 모션 리미티드|Previewing a new event on a small screen device|
EP1538536A1|2003-12-05|2005-06-08|Sony International GmbH|Visualization and control techniques for multimedia digital content|
US7103388B2|2003-12-16|2006-09-05|Research In Motion Limited|Expedited communication graphical user interface system and method|
EP1557837A1|2004-01-26|2005-07-27|Sony International GmbH|Redundancy elimination in a content-adaptive video preview system|
US20050198584A1|2004-01-27|2005-09-08|Matthews David A.|System and method for controlling manipulation of tiles within a sidebar|
US20050164688A1|2004-01-27|2005-07-28|Kyocera Corporation|Mobile terminal, method for controlling mobile telephone terminal, and mobile telephone terminal|
US7296184B2|2004-01-28|2007-11-13|Microsoft Corporation|Method and system for masking dynamic regions in a user interface to enable testing of user interface consistency|
US7403191B2|2004-01-28|2008-07-22|Microsoft Corporation|Tactile overlay for an imaging display|
US8001120B2|2004-02-12|2011-08-16|Microsoft Corporation|Recent contacts and items|
US20050183021A1|2004-02-13|2005-08-18|Allen Joel E.|Method for electronically packaging a user's personal computing environment on a computer or device, and mobilizing it for transfer over a network|
JP4071726B2|2004-02-25|2008-04-02|シャープ株式会社|Portable information device, character display method in portable information device, and program for realizing the method|
US20050198159A1|2004-03-08|2005-09-08|Kirsch Steven T.|Method and system for categorizing and processing e-mails based upon information in the message header and SMTP session|
WO2005089286A2|2004-03-15|2005-09-29|America Online, Inc.|Sharing social network information|
GB0406451D0|2004-03-23|2004-04-28|Patel Sanjay|Keyboards|
US7599790B2|2004-03-23|2009-10-06|Google Inc.|Generating and serving tiles in a digital mapping system|
FI20040446A|2004-03-24|2005-09-25|Nokia Corp|Procedure for administering application hardware, electronic device and computer software product|
US7289806B2|2004-03-30|2007-10-30|Intel Corporation|Method and apparatus for context enabled search|
US7912904B2|2004-03-31|2011-03-22|Google Inc.|Email system with conversation-centric user interface|
US8027276B2|2004-04-14|2011-09-27|Siemens Enterprise Communications, Inc.|Mixed mode conferencing|
US8448083B1|2004-04-16|2013-05-21|Apple Inc.|Gesture control of multimedia editing applications|
EP1589444A3|2004-04-21|2008-03-12|Samsung Electronics Co., Ltd.|Method, medium, and apparatus for detecting situation change of digital photos and method, medium, and apparatus for situation-based photo clustering in digital photo album|
US7202802B2|2004-04-27|2007-04-10|Wildseed Ltd.|Reduced keypad|
US8707209B2|2004-04-29|2014-04-22|Microsoft Corporation|Save preview representation of files being created|
US7663607B2|2004-05-06|2010-02-16|Apple Inc.|Multipoint touchscreen|
EP1596613A1|2004-05-10|2005-11-16|Dialog Semiconductor GmbH|Data and voice transmission within the same mobile phone call|
US7386807B2|2004-05-17|2008-06-10|Microsoft Corporation|System and method for monitoring application response and providing visual treatment|
US7353466B2|2004-05-28|2008-04-01|Microsoft Corporation|System and method for generating message notification objects on dynamically scaled timeline|
EP1766940A4|2004-06-04|2012-04-11|Systems Ltd Keyless|System to enhance data entry in mobile and fixed environment|
US7434058B2|2004-06-07|2008-10-07|Reconnex Corporation|Generating signatures over a document|
US7469380B2|2004-06-15|2008-12-23|Microsoft Corporation|Dynamic document and template previews|
US7761800B2|2004-06-25|2010-07-20|Apple Inc.|Unified interest layer for user interface|
US7464110B2|2004-06-30|2008-12-09|Nokia Corporation|Automated grouping of image and other user data|
US7388578B2|2004-07-01|2008-06-17|Nokia Corporation|Touch display PDA phone with slide keypad|
US7669135B2|2004-07-15|2010-02-23|At&T Mobility Ii Llc|Using emoticons, such as for wireless devices|
US20060015726A1|2004-07-19|2006-01-19|Callas Jonathan D|Apparatus for partial authentication of messages|
US7958115B2|2004-07-29|2011-06-07|Yahoo! Inc.|Search systems and methods using in-line contextual queries|
JP2006042171A|2004-07-29|2006-02-09|Olympus Corp|Camera, reproducing apparatus and album registration method|
US8479122B2|2004-07-30|2013-07-02|Apple Inc.|Gestures for touch sensitive input devices|
US7653883B2|2004-07-30|2010-01-26|Apple Inc.|Proximity detector in handheld device|
US7178111B2|2004-08-03|2007-02-13|Microsoft Corporation|Multi-planar three-dimensional user interface|
US7181373B2|2004-08-13|2007-02-20|Agilent Technologies, Inc.|System and methods for navigating and visualizing multi-dimensional biological data|
US7559053B2|2004-08-24|2009-07-07|Microsoft Corporation|Program and system performance data correlation|
KR20060019198A|2004-08-27|2006-03-03|서동휘|Method and device for transmitting and receiving graphic emoticons, and method for mapping graphic emoticons|
US7434173B2|2004-08-30|2008-10-07|Microsoft Corporation|Scrolling web pages using direct interaction|
US7619615B1|2004-08-31|2009-11-17|Sun Microsystems, Inc.|Method and apparatus for soft keys of an electronic device|
KR100854333B1|2004-09-02|2008-09-02|리얼네트웍스아시아퍼시픽 주식회사|Method for processing call establishment by using character string|
US8473848B2|2004-09-15|2013-06-25|Research In Motion Limited|Palette-based color selection within a user interface theme|
US20070061488A1|2004-09-20|2007-03-15|Trilibis Inc.|System and method for flexible user interfaces|
US8510657B2|2004-09-30|2013-08-13|Microsoft Corporation|Editing the text of an arbitrary graphic via a hierarchical list|
US20060074735A1|2004-10-01|2006-04-06|Microsoft Corporation|Ink-enabled workflow authoring|
US20060075360A1|2004-10-04|2006-04-06|Edwards Systems Technology, Inc.|Dynamic highlight prompting apparatus and method|
KR100738069B1|2004-10-04|2007-07-10|삼성전자주식회사|Method and apparatus for category-based photo clustering in digital photo album|
US7512966B2|2004-10-14|2009-03-31|International Business Machines Corporation|System and method for visually rendering resource policy usage information|
KR100597670B1|2004-10-18|2006-07-07|주식회사 네오엠텔|mobile communication terminal capable of reproducing and updating multimedia content, and method for reproducing the same|
US7657842B2|2004-11-12|2010-02-02|Microsoft Corporation|Sidebar tile free-arrangement|
US20060103623A1|2004-11-15|2006-05-18|Nokia Corporation|Method and apparatus to enter text in a phone dialer entry field|
KR100703690B1|2004-11-19|2007-04-05|삼성전자주식회사|User interface and method for managing icon by grouping using skin image|
US7581034B2|2004-11-23|2009-08-25|Microsoft Corporation|Sending notifications to auxiliary displays|
EP1662760A1|2004-11-30|2006-05-31|Sony Ericsson Mobile Communications AB|Method for providing alerts in a mobile device and mobile device therefor|
KR100809585B1|2004-12-21|2008-03-07|삼성전자주식회사|Device and method for processing schedule-related event in wireless terminal|
WO2007065019A2|2005-12-02|2007-06-07|Hillcrest Laboratories, Inc.|Scene transitions in a zoomable user interface using zoomable markup language|
US7073908B1|2005-01-11|2006-07-11|Anthony Italo Provitola|Enhancement of depth perception|
US7478326B2|2005-01-18|2009-01-13|Microsoft Corporation|Window information switching system|
US7317907B2|2005-01-31|2008-01-08|Research In Motion Limited|Synchronizing server and device data using device data schema|
US7571189B2|2005-02-02|2009-08-04|Lightsurf Technologies, Inc.|Method and apparatus to implement themes for a handheld device|
US20060184901A1|2005-02-15|2006-08-17|Microsoft Corporation|Computer content navigation tools|
US8819569B2|2005-02-18|2014-08-26|Zumobi, Inc|Single-handed approach for navigation of application tiles using panning and zooming|
US20060212806A1|2005-03-18|2006-09-21|Microsoft Corporation|Application of presentation styles to items on a web page|
US20060218234A1|2005-03-24|2006-09-28|Li Deng|Scheme of sending email to mobile devices|
US7725837B2|2005-03-31|2010-05-25|Microsoft Corporation|Digital image browser|
US20060223593A1|2005-04-01|2006-10-05|Ixi Mobile Ltd.|Content delivery system and method for a mobile communication device|
US9141402B2|2005-04-25|2015-09-22|Aol Inc.|Providing a user interface|
US20060246955A1|2005-05-02|2006-11-02|Mikko Nirhamo|Mobile communication device and method therefor|
US7949542B2|2005-05-05|2011-05-24|Ionosoft, Inc.|System, method and computer program product for graphically illustrating entities and generating a text-based report therefrom|
US8769433B2|2005-05-13|2014-07-01|Entrust, Inc.|Method and apparatus for protecting communication of information through a graphical user interface|
US20070024646A1|2005-05-23|2007-02-01|Kalle Saarinen|Portable electronic apparatus and associated method|
US20060271520A1|2005-05-27|2006-11-30|Ragan Gene Z|Content-based implicit search query|
US7797641B2|2005-05-27|2010-09-14|Nokia Corporation|Mobile communications terminal and method therefore|
US7685530B2|2005-06-10|2010-03-23|T-Mobile Usa, Inc.|Preferred contact group centric interface|
US7684791B2|2005-06-13|2010-03-23|Research In Motion Limited|Multiple keyboard context sensitivity for application usage|
KR100627799B1|2005-06-15|2006-09-25|에스케이 텔레콤주식회사|Method and mobile communication terminal for providing function of integration management of short message service|
US7487467B1|2005-06-23|2009-02-03|Sun Microsystems, Inc.|Visual representation and other effects for application management on a device with a small screen|
US7720834B2|2005-06-23|2010-05-18|Microsoft Corporation|Application launching via indexed data|
US20060294396A1|2005-06-24|2006-12-28|Robert Witman|Multiplatform synchronized data access from mobile devices of dynamically aggregated content|
US7730142B2|2005-07-01|2010-06-01|0733660 B.C. Ltd.|Electronic mail system with functionality to include both private and public messages in a communication|
US20070011610A1|2005-07-11|2007-01-11|Onskreen Inc.|Customized Mobile Device Interface System And Method|
US20070015532A1|2005-07-15|2007-01-18|Tom Deelman|Multi-function key for electronic devices|
US7577918B2|2005-07-15|2009-08-18|Microsoft Corporation|Visual expression of a state of an application window|
EP1920408A2|2005-08-02|2008-05-14|Ipifini, Inc.|Input device having multifunctional keys|
CN100501647C|2005-08-12|2009-06-17|深圳华为通信技术有限公司|Keypad of cell phone and use thereof|
US7925973B2|2005-08-12|2011-04-12|Brightcove, Inc.|Distribution of content|
US8225231B2|2005-08-30|2012-07-17|Microsoft Corporation|Aggregation of PC settings|
KR100757867B1|2005-08-30|2007-09-11|삼성전자주식회사|Apparatus and method of interface in multitasking system|
KR100714700B1|2005-09-06|2007-05-07|삼성전자주식회사|Mobile communication terminal and method for outputting a short message thereof|
US20070061714A1|2005-09-09|2007-03-15|Microsoft Corporation|Quick styles for formatting of documents|
US20070073718A1|2005-09-14|2007-03-29|Jorey Ramer|Mobile search service instant activation|
US7933632B2|2005-09-16|2011-04-26|Microsoft Corporation|Tile space user interface for mobile devices|
US7873356B2|2005-09-16|2011-01-18|Microsoft Corporation|Search interface for mobile devices|
US20070063995A1|2005-09-22|2007-03-22|Bailey Eric A|Graphical user interface for use with a multi-media system|
US8539374B2|2005-09-23|2013-09-17|Disney Enterprises, Inc.|Graphical user interface for electronic devices|
US8860748B2|2005-10-03|2014-10-14|Gary Lynn Campbell|Computerized, personal-color analysis system|
US7869832B2|2005-10-07|2011-01-11|Research In Motion Limited|Device, system, and method for informing users of functions and characters associated with telephone keys|
US20070083821A1|2005-10-07|2007-04-12|International Business Machines Corporation|Creating viewports from selected regions of windows|
US8689147B2|2005-10-07|2014-04-01|Blackberry Limited|System and method for using navigational and other commands on a mobile communication device|
US7280097B2|2005-10-11|2007-10-09|Zeetoo, Inc.|Human interface input acceleration system|
JP2007148927A|2005-11-29|2007-06-14|Alps Electric Co Ltd|Input device and scrolling control method using the same|
US7412663B2|2005-11-30|2008-08-12|Microsoft Corporation|Dynamic reflective highlighting of a glass appearance window frame|
KR100785067B1|2005-12-06|2007-12-12|삼성전자주식회사|Device and method for displaying screen image in wireless terminal|
US9069877B2|2005-12-07|2015-06-30|Ziilabs Inc., Ltd.|User interface with variable sized icons|
US7664067B2|2005-12-15|2010-02-16|Microsoft Corporation|Preserving socket connections over a wireless network|
CN100488177C|2005-12-22|2009-05-13|华为技术有限公司|Method and device for realizing pocket transmission news service|
US7657849B2|2005-12-23|2010-02-02|Apple Inc.|Unlocking a device by performing gestures on an unlock image|
US7480870B2|2005-12-23|2009-01-20|Apple Inc.|Indication of progress towards satisfaction of a user input condition|
EP1804153A1|2005-12-27|2007-07-04|Amadeus s.a.s|User customizable drop-down control list for GUI software applications|
US7509588B2|2005-12-30|2009-03-24|Apple Inc.|Portable electronic device with interface reconfiguration mode|
AU2006332488A1|2005-12-30|2007-07-12|Apple Inc.|Portable electronic device with multi-touch input|
US7895309B2|2006-01-11|2011-02-22|Microsoft Corporation|Network event notification and delivery|
US7657603B1|2006-01-23|2010-02-02|Clearwell Systems, Inc.|Methods and systems of electronic message derivation|
US20070177804A1|2006-01-30|2007-08-02|Apple Computer, Inc.|Multi-touch gesture dictionary|
US7610279B2|2006-01-31|2009-10-27|Perfect Market, Inc.|Filtering context-sensitive search results|
US20070198420A1|2006-02-03|2007-08-23|Leonid Goldstein|Method and a system for outbound content security in computer networks|
US7536654B2|2006-02-06|2009-05-19|Microsoft Corporation|Photo browse and zoom|
US8537117B2|2006-02-13|2013-09-17|Blackberry Limited|Handheld wireless communication device that selectively generates a menu in response to received commands|
KR101033708B1|2006-02-13|2011-05-09|인터내셔널 비지네스 머신즈 코포레이션|Control device, control program, and control method for controlling display of display device for displaying superimposed windows|
JP4844814B2|2006-02-13|2011-12-28|ソニー株式会社|Imaging apparatus and method, and program|
JP2007219830A|2006-02-16|2007-08-30|Fanuc Ltd|Numerical controller|
US20070197196A1|2006-02-22|2007-08-23|Michael Shenfield|Apparatus, and associated method, for facilitating delivery and processing of push content|
US20070208840A1|2006-03-03|2007-09-06|Nortel Networks Limited|Graphical user interface for network management|
US20070214429A1|2006-03-13|2007-09-13|Olga Lyudovyk|System and method for managing application alerts|
TWI300184B|2006-03-17|2008-08-21|Htc Corp|Information navigation methods, and machine readable medium thereof|
US7595810B2|2006-03-22|2009-09-29|Apple Inc.|Methods of manipulating a screen space of a display device|
US20070236468A1|2006-03-30|2007-10-11|Apaar Tuli|Gesture based device activation|
US8111243B2|2006-03-30|2012-02-07|Cypress Semiconductor Corporation|Apparatus and method for recognizing a tap gesture on a touch sensing device|
US8244757B2|2006-03-30|2012-08-14|Microsoft Corporation|Facet-based interface for mobile search|
US20070238488A1|2006-03-31|2007-10-11|Research In Motion Limited|Primary actions menu for a mobile communication device|
US8744056B2|2006-04-04|2014-06-03|Sony Corporation|Communication identifier list configuration|
US8255473B2|2006-04-04|2012-08-28|International Business Machines Corporation|Caching message fragments during real-time messaging conversations|
US8077153B2|2006-04-19|2011-12-13|Microsoft Corporation|Precise selection techniques for multi-touch screens|
US8156187B2|2006-04-20|2012-04-10|Research In Motion Limited|Searching for electronic mail messages with attachments at a wireless communication device|
WO2007121557A1|2006-04-21|2007-11-01|Anand Agarawala|System for organizing and visualizing display objects|
US7636779B2|2006-04-28|2009-12-22|Yahoo! Inc.|Contextual mobile local search based on social network vitality information|
US20070256029A1|2006-05-01|2007-11-01|Rpo Pty Llimited|Systems And Methods For Interfacing A User With A Touch-Screen|
US20070260674A1|2006-05-02|2007-11-08|Research In Motion Limited|Push framework for delivery of dynamic mobile content|
US7646392B2|2006-05-03|2010-01-12|Research In Motion Limited|Dynamic theme color palette generation|
US20070257891A1|2006-05-03|2007-11-08|Esenther Alan W|Method and system for emulating a mouse on a multi-touch sensitive surface|
US9063647B2|2006-05-12|2015-06-23|Microsoft Technology Licensing, Llc|Multi-touch uses, gestures, and implementation|
WO2007134623A1|2006-05-23|2007-11-29|Nokia Corporation|Mobile communication terminal with enhanced phonebook management|
KR20070113018A|2006-05-24|2007-11-28|엘지전자 주식회사|Apparatus and operating method of touch screen|
KR101188083B1|2006-05-24|2012-10-05|삼성전자주식회사|Method for providing idle screen layer given an visual effect and method of providing idle screen|
TW200805131A|2006-05-24|2008-01-16|Lg Electronics Inc|Touch screen device and method of selecting files thereon|
US7953448B2|2006-05-31|2011-05-31|Research In Motion Limited|Keyboard for mobile device|
US8571580B2|2006-06-01|2013-10-29|Loopt Llc.|Displaying the location of individuals on an interactive map display on a mobile communication device|
US8594634B2|2006-06-02|2013-11-26|International Business Machines Corporation|Missed call integration with voicemail and granular access to voicemail|
US7640518B2|2006-06-14|2009-12-29|Mitsubishi Electric Research Laboratories, Inc.|Method and system for switching between absolute and relative pointing with direct input devices|
KR20070120368A|2006-06-19|2007-12-24|엘지전자 주식회사|Method and appratus for controlling of menu - icon|
US20080040692A1|2006-06-29|2008-02-14|Microsoft Corporation|Gesture input|
US7880728B2|2006-06-29|2011-02-01|Microsoft Corporation|Application switching via a touch screen interface|
US7779370B2|2006-06-30|2010-08-17|Google Inc.|User interface for mobile devices|
IL176673D0|2006-07-03|2007-07-04|Fermon Israel|A variably displayable mobile device keyboard|
US20080034284A1|2006-07-28|2008-02-07|Blue Lava Technologies|Method and system for displaying multimedia content|
US20080032681A1|2006-08-01|2008-02-07|Sony Ericsson Mobile Communications Ab|Click-hold Operations of Mobile Device Input Keys|
US7996487B2|2006-08-23|2011-08-09|Oracle International Corporation|Managing searches on mobile devices|
US8564544B2|2006-09-06|2013-10-22|Apple Inc.|Touch screen device, method, and graphical user interface for customizing display of content category icons|
US8014760B2|2006-09-06|2011-09-06|Apple Inc.|Missed telephone call management for a portable multifunction device|
US7941760B2|2006-09-06|2011-05-10|Apple Inc.|Soft keyboard display for a portable multifunction device|
WO2008031871A1|2006-09-13|2008-03-20|Imencro Software Sa|Method for automatically classifying communication between a sender and a recipient|
US7702683B1|2006-09-18|2010-04-20|Hewlett-Packard Development Company, L.P.|Estimating similarity between two collections of information|
WO2008035831A1|2006-09-22|2008-03-27|Gt Telecom, Co., Ltd|Celluar phones having a function of dialing with a searched name|
US20080076472A1|2006-09-22|2008-03-27|Sony Ericsson Mobile Communications Ab|Intelligent Predictive Text Entry|
KR100774927B1|2006-09-27|2007-11-09|엘지전자 주식회사|Mobile communication terminal, menu and item selection method using the same|
SG141289A1|2006-09-29|2008-04-28|Wireless Intellect Labs Pte Lt|An event update management system|
US8756510B2|2006-10-17|2014-06-17|Cooliris, Inc.|Method and system for displaying photos, videos, RSS and other media content in full-screen immersive view and grid-view using a browser feature|
US8891455B2|2006-10-23|2014-11-18|Samsung Electronics Co., Ltd.|Synchronous spectrum sharing by dedicated networks using OFDM/OFDMA signaling|
US20080102863A1|2006-10-31|2008-05-01|Research In Motion Limited|System, method, and user interface for searching for messages associated with a message service on a mobile device|
US8942739B2|2006-11-06|2015-01-27|Qualcomm Incorporated|Methods and apparatus for communication of notifications|
US20080113656A1|2006-11-15|2008-05-15|Lg Telecom Ltd.|System and method for updating contents|
US8117555B2|2006-12-07|2012-02-14|Sap Ag|Cooperating widgets|
US9003296B2|2006-12-20|2015-04-07|Yahoo! Inc.|Browser renderable toolbar|
US20080163104A1|2006-12-30|2008-07-03|Tobias Haug|Multiple window handler on display screen|
US7921176B2|2007-01-03|2011-04-05|Madnani Rajkumar R|Mechanism for generating a composite email|
US7907125B2|2007-01-05|2011-03-15|Microsoft Corporation|Recognizing multiple input point gestures|
US7924271B2|2007-01-05|2011-04-12|Apple Inc.|Detecting gestures on multi-event sensitive devices|
US7956847B2|2007-01-05|2011-06-07|Apple Inc.|Gestures for controlling, manipulating, and editing of media files using touch sensitive devices|
US7877707B2|2007-01-06|2011-01-25|Apple Inc.|Detecting and interpreting real-world and security gestures on touch and hover sensitive devices|
US8689132B2|2007-01-07|2014-04-01|Apple Inc.|Portable electronic device, method, and graphical user interface for displaying electronic documents and lists|
US7671756B2|2007-01-07|2010-03-02|Apple Inc.|Portable electronic device with alert silencing|
US8091045B2|2007-01-07|2012-01-03|Apple Inc.|System and method for managing lists|
US20080168382A1|2007-01-07|2008-07-10|Louch John O|Dashboards, Widgets and Devices|
US20080222545A1|2007-01-07|2008-09-11|Lemay Stephen O|Portable Electronic Device with a Global Setting User Interface|
US7469381B2|2007-01-07|2008-12-23|Apple Inc.|List scrolling and document translation, scaling, and rotation on a touch-screen display|
US20080168402A1|2007-01-07|2008-07-10|Christopher Blumenberg|Application Programming Interfaces for Gesture Operations|
US8082523B2|2007-01-07|2011-12-20|Apple Inc.|Portable electronic device with graphical user interface supporting application switching|
US7791598B2|2007-01-10|2010-09-07|Microsoft Corporation|Hybrid pen mouse user input device|
US20080172609A1|2007-01-11|2008-07-17|Nokia Corporation|Multiple application handling|
US20080182628A1|2007-01-26|2008-07-31|Matthew Lee|System and method for previewing themes|
US20080180399A1|2007-01-31|2008-07-31|Tung Wan Cheng|Flexible Multi-touch Screen|
US8601370B2|2007-01-31|2013-12-03|Blackberry Limited|System and method for organizing icons for applications on a mobile device|
KR20080073868A|2007-02-07|2008-08-12|엘지전자 주식회사|Terminal and method for displaying menu|
US7737979B2|2007-02-12|2010-06-15|Microsoft Corporation|Animated transitions for data visualization|
US7853240B2|2007-02-15|2010-12-14|Research In Motion Limited|Emergency number selection for mobile communications device|
KR101426718B1|2007-02-15|2014-08-05|삼성전자주식회사|Apparatus and method for displaying of information according to touch event in a portable terminal|
US8078969B2|2007-03-05|2011-12-13|Shutterfly, Inc.|User interface for creating image collage|
US20080222273A1|2007-03-07|2008-09-11|Microsoft Corporation|Adaptive rendering of web pages on mobile devices using imaging technology|
US8352881B2|2007-03-08|2013-01-08|International Business Machines Corporation|Method, apparatus and program storage device for providing customizable, immediate and radiating menus for accessing applications and actions|
US8255812B1|2007-03-15|2012-08-28|Google Inc.|Embedding user-selected content feed items in a webpage|
US20080242362A1|2007-03-26|2008-10-02|Helio, Llc|Rapid Content Association Methods|
KR101344265B1|2007-04-17|2013-12-24|삼성전자주식회사|Method for displaying human relations and mobile terminal thereof|
US7884805B2|2007-04-17|2011-02-08|Sony Ericsson Mobile Communications Ab|Using touches to transfer information between devices|
TWI418200B|2007-04-20|2013-12-01|Lg Electronics Inc|Mobile terminal and screen displaying method thereof|
US20080301104A1|2007-06-01|2008-12-04|Kendall Gregory Lockhart|System and method for implementing enhanced search functionality|
US8381122B2|2007-06-08|2013-02-19|Apple Inc.|Multi-dimensional application environment|
US9740386B2|2007-06-13|2017-08-22|Apple Inc.|Speed/positional mode translations|
US8923507B2|2007-06-20|2014-12-30|Microsoft Corporation|Alpha character support and translation in dialer|
US8059101B2|2007-06-22|2011-11-15|Apple Inc.|Swipe gestures for touch screen keyboards|
US20080316177A1|2007-06-22|2008-12-25|Kuo-Hwa Tseng|Mouse-type mobile phone|
US8065628B2|2007-06-25|2011-11-22|Microsoft Corporation|Dynamic user interface for previewing live content|
CN101971599B|2007-06-27|2016-01-20|卡伦诺尔斯企业私人有限公司|Method, system and product for communication|
JP5133001B2|2007-06-28|2013-01-30|京セラ株式会社|Portable electronic device and display method in the same device|
US8762880B2|2007-06-29|2014-06-24|Microsoft Corporation|Exposing non-authoring features through document status information in an out-space user interface|
US9772751B2|2007-06-29|2017-09-26|Apple Inc.|Using gestures to slide between user interfaces|
US7707205B2|2007-07-05|2010-04-27|Sony Ericsson Mobile Communications Ab|Apparatus and method for locating a target item in a list|
US20120229473A1|2007-07-17|2012-09-13|Airgini Group, Inc.|Dynamic Animation in a Mobile Device|
KR20090011314A|2007-07-25|2009-02-02|삼성전자주식회사|Mobile terminal and sim card displaying method thereof|
US9489216B2|2007-07-26|2016-11-08|Sap Se|Active tiled user interface|
US7783597B2|2007-08-02|2010-08-24|Abaca Technology Corporation|Email filtering using recipient reputation|
JP5046158B2|2007-08-10|2012-10-10|インターナショナル・ビジネス・マシーンズ・コーポレーション|Apparatus and method for detecting characteristics of an e-mail message|
US20080301046A1|2007-08-10|2008-12-04|Christian John Martinez|Methods and systems for making a payment and/or a donation via a network, such as the Internet, using a drag and drop user interface|
US7877687B2|2007-08-16|2011-01-25|Yahoo! Inc.|Persistent visual media player|
KR101430445B1|2007-08-20|2014-08-14|엘지전자 주식회사|Terminal having function for controlling screen size and program recording medium|
US20090051671A1|2007-08-22|2009-02-26|Jason Antony Konstas|Recognizing the motion of two or more touches on a touch-sensing surface|
WO2009029296A1|2007-08-31|2009-03-05|At & T Mobility Ii Llc|Enhanced messaging with language translation feature|
US9477395B2|2007-09-04|2016-10-25|Apple Inc.|Audio file interface|
US11126321B2|2007-09-04|2021-09-21|Apple Inc.|Application menu user interface|
US20090070673A1|2007-09-06|2009-03-12|Guy Barkan|System and method for presenting multimedia content and application interface|
US20090077649A1|2007-09-13|2009-03-19|Soft Trust, Inc.|Secure messaging system and method|
US8098235B2|2007-09-28|2012-01-17|Immersion Corporation|Multi-touch device having dynamic haptic effects|
US9177317B2|2007-09-28|2015-11-03|Bank Of America Corporation|System and method for consumer protection|
US8094105B2|2007-09-28|2012-01-10|Motorola Mobility, Inc.|Navigation for a non-traditionally shaped liquid crystal display for mobile handset devices|
EP2045700A1|2007-10-04|2009-04-08|LG Electronics Inc.|Menu display method for a mobile communication terminal|
KR20140102762A|2007-10-05|2014-08-22|지브이비비 홀딩스 에스.에이.알.엘.|Pointer controlling apparatus|
US7983718B1|2007-10-11|2011-07-19|Sprint Spectrum L.P.|Wireless phones with keys displaying image files|
US20090109243A1|2007-10-25|2009-04-30|Nokia Corporation|Apparatus and method for zooming objects on a display|
US8275398B2|2007-11-02|2012-09-25|Hewlett-Packard Development Company, L.P.|Message addressing techniques for a mobile computing device|
US7992104B2|2007-11-13|2011-08-02|Microsoft Corporation|Viewing data|
US8745513B2|2007-11-29|2014-06-03|Sony Corporation|Method and apparatus for use in accessing content|
US8020780B2|2007-11-30|2011-09-20|Honeywell International Inc.|Thermostatic control system having a configurable lock|
US20090140986A1|2007-11-30|2009-06-04|Nokia Corporation|Method, apparatus and computer program product for transferring files between devices via drag and drop|
US20090146962A1|2007-12-05|2009-06-11|Nokia Corporation|Mobile communication terminal and method|
US8212784B2|2007-12-13|2012-07-03|Microsoft Corporation|Selection and display of media associated with a geographic area based on gesture input|
US20090164888A1|2007-12-19|2009-06-25|Thomas Phan|Automated Content-Based Adjustment of Formatting and Application Behavior|
JP4605478B2|2007-12-19|2011-01-05|ソニー株式会社|Information processing apparatus, display control method, and display control program|
KR20090066368A|2007-12-20|2009-06-24|삼성전자주식회사|Portable terminal having touch screen and method for performing function thereof|
US20090164928A1|2007-12-21|2009-06-25|Nokia Corporation|Method, apparatus and computer program product for providing an improved user interface|
US8515397B2|2007-12-24|2013-08-20|Qualcomm Incorporated|Time and location based theme of mobile telephones|
US9372576B2|2008-01-04|2016-06-21|Apple Inc.|Image jaggedness filter for determining whether to perform baseline calculations|
US8171432B2|2008-01-06|2012-05-01|Apple Inc.|Touch screen device, method, and graphical user interface for displaying and selecting application options|
US20090182788A1|2008-01-14|2009-07-16|Zenbe, Inc.|Apparatus and method for customized email and data management|
WO2009093241A2|2008-01-23|2009-07-30|N-Trig Ltd.|Graphical object manipulation with a touch sensitive screen|
US8677285B2|2008-02-01|2014-03-18|Wimm Labs, Inc.|User interface of a small touch sensitive display for an electronic data and communication device|
US8356258B2|2008-02-01|2013-01-15|Microsoft Corporation|Arranging display areas utilizing enhanced window states|
US9612847B2|2008-02-05|2017-04-04|Microsoft Technology Licensing, Llc|Destination list associated with an application launcher|
US8910299B2|2008-02-08|2014-12-09|Steven Charles Michalske|Emergency information access on portable electronic devices|
US9772689B2|2008-03-04|2017-09-26|Qualcomm Incorporated|Enhanced gesture-based image manipulation|
US8205157B2|2008-03-04|2012-06-19|Apple Inc.|Methods and graphical user interfaces for conducting searches on a portable multifunction device|
JP2009245423A|2008-03-13|2009-10-22|Panasonic Corp|Information device and window display method|
US8327286B2|2008-03-13|2012-12-04|Microsoft Corporation|Unifying application launchers and switchers|
US9269059B2|2008-03-25|2016-02-23|Qualcomm Incorporated|Apparatus and methods for transport optimization for widget content delivery|
US20090249257A1|2008-03-31|2009-10-01|Nokia Corporation|Cursor navigation assistance|
TWI381304B|2008-04-22|2013-01-01|Htc Corp|Method and apparatus for adjusting display area of user interface and recoding medium using the same|
JP4171770B1|2008-04-24|2008-10-29|任天堂株式会社|Object display order changing program and apparatus|
US8174503B2|2008-05-17|2012-05-08|David H. Cain|Touch-based authentication of a mobile device through user generated pattern creation|
US8296670B2|2008-05-19|2012-10-23|Microsoft Corporation|Accessing a menu utilizing a drag-operation|
US8296684B2|2008-05-23|2012-10-23|Hewlett-Packard Development Company, L.P.|Navigating among activities in a computing device|
US8375336B2|2008-05-23|2013-02-12|Microsoft Corporation|Panning content utilizing a drag operation|
US8683362B2|2008-05-23|2014-03-25|Qualcomm Incorporated|Card metaphor for activities in a computing device|
CA2725542C|2008-05-28|2016-12-13|Google Inc.|Motion-controlled views on mobile computing devices|
EP2129090B1|2008-05-29|2016-06-15|LG Electronics Inc.|Mobile terminal and display control method thereof|
JP5164675B2|2008-06-04|2013-03-21|キヤノン株式会社|User interface control method, information processing apparatus, and program|
US8099332B2|2008-06-06|2012-01-17|Apple Inc.|User interface for application management for a mobile device|
US8135392B2|2008-06-06|2012-03-13|Apple Inc.|Managing notification service connections and displaying icon badges|
US8477139B2|2008-06-09|2013-07-02|Apple Inc.|Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects|
KR101477743B1|2008-06-16|2014-12-31|삼성전자 주식회사|Terminal and method for performing function thereof|
US9092053B2|2008-06-17|2015-07-28|Apple Inc.|Systems and methods for adjusting a display based on the user's position|
GB0811196D0|2008-06-18|2008-07-23|Skype Ltd|Searching method and apparatus|
JP2010003098A|2008-06-20|2010-01-07|Konica Minolta Business Technologies Inc|Input device, operation acceptance method and operation acceptance program|
US8154524B2|2008-06-24|2012-04-10|Microsoft Corporation|Physics simulation-based interaction for surface computing|
US20090322760A1|2008-06-26|2009-12-31|Microsoft Corporation|Dynamic animation scheduling|
US20090327969A1|2008-06-27|2009-12-31|Microsoft Corporation|Semantic zoom in a virtual three-dimensional graphical user interface|
US8150017B2|2008-07-11|2012-04-03|Verizon Patent And Licensing Inc.|Phone dialer with advanced search feature and associated method of searching a directory|
TW201005599A|2008-07-18|2010-02-01|Asustek Comp Inc|Touch-type mobile computing device and control method of the same|
KR20100010072A|2008-07-22|2010-02-01|엘지전자 주식회사|Controlling method of user interface for multitasking of mobile devices|
US8390577B2|2008-07-25|2013-03-05|Intuilab|Continuous recognition of multi-touch gestures|
WO2010015070A1|2008-08-07|2010-02-11|Research In Motion Limited|System and method for providing content on a mobile device by controlling an application independent of user action|
US8924892B2|2008-08-22|2014-12-30|Fuji Xerox Co., Ltd.|Multiple selection on devices with many gestures|
JP4636141B2|2008-08-28|2011-02-23|ソニー株式会社|Information processing apparatus and method, and program|
US20100058248A1|2008-08-29|2010-03-04|Johnson Controls Technology Company|Graphical user interfaces for building management systems|
US20100070931A1|2008-09-15|2010-03-18|Sony Ericsson Mobile Communications Ab|Method and apparatus for selecting an object|
KR101548958B1|2008-09-18|2015-09-01|삼성전자주식회사|A method for operating control in mobile terminal with touch screen and apparatus thereof.|
US8352864B2|2008-09-19|2013-01-08|Cisco Technology, Inc.|Method of operating a design generator for personalization of electronic devices|
US8296658B2|2008-09-19|2012-10-23|Cisco Technology, Inc.|Generator for personalization of electronic devices|
US8595371B2|2008-09-19|2013-11-26|Samsung Electronics Co., Ltd.|Sending a remote user interface|
US20100075628A1|2008-09-19|2010-03-25|Verizon Data Services Llc|Method and apparatus for transmitting authenticated emergency messages|
US8600446B2|2008-09-26|2013-12-03|Htc Corporation|Mobile device interface with dual windows|
US8176438B2|2008-09-26|2012-05-08|Microsoft Corporation|Multi-modal interaction for a screen magnifier|
US20100079413A1|2008-09-29|2010-04-01|Denso Corporation|Control device|
JP5345129B2|2008-09-29|2013-11-20|パナソニック株式会社|User interface device, user interface method, and recording medium|
US20100087173A1|2008-10-02|2010-04-08|Microsoft Corporation|Inter-threading Indications of Different Types of Communication|
US20100087169A1|2008-10-02|2010-04-08|Microsoft Corporation|Threading together messages with multiple common participants|
US9015616B2|2008-10-22|2015-04-21|Google Inc.|Search initiation|
US20100105424A1|2008-10-23|2010-04-29|Smuga Michael A|Mobile Communications Device User Interface|
US20100105441A1|2008-10-23|2010-04-29|Chad Aron Voss|Display Size of Representations of Content|
US8086275B2|2008-10-23|2011-12-27|Microsoft Corporation|Alternative inputs of a mobile communications device|
US8411046B2|2008-10-23|2013-04-02|Microsoft Corporation|Column organization of content|
TW201023026A|2008-10-23|2010-06-16|Microsoft Corp|Location-based display characteristics in a user interface|
US8385952B2|2008-10-23|2013-02-26|Microsoft Corporation|Mobile communications device user interface|
US8477103B2|2008-10-26|2013-07-02|Microsoft Corporation|Multi-touch object inertia simulation|
US8108623B2|2008-10-26|2012-01-31|Microsoft Corporation|Poll based cache event notifications in a distributed cache|
US20100107067A1|2008-10-27|2010-04-29|Nokia Corporation|Input on touch based user interfaces|
KR101029627B1|2008-10-31|2011-04-15|에스케이텔레시스 주식회사|Method of operating functions of mobile terminal with touch screen and apparatus thereof|
WO2010055197A1|2008-11-11|2010-05-20|Nokia Corporation|Method and apparatus for managing advertising-enabled applications|
KR20100056350A|2008-11-18|2010-05-27|황선원|Method and apparatus for automatically outputting updated music letter voice and picture on initial display window of the portable display devices|
US8302026B2|2008-11-28|2012-10-30|Microsoft Corporation|Multi-panel user interface|
US20100146437A1|2008-12-04|2010-06-10|Microsoft Corporation|Glanceable animated notifications on a locked device|
US20100145675A1|2008-12-04|2010-06-10|Microsoft Corporation|User interface having customizable text strings|
US8942767B2|2008-12-19|2015-01-27|Verizon Patent And Licensing Inc.|Communications convergence and user interface systems, apparatuses, and methods|
US8331992B2|2008-12-19|2012-12-11|Verizon Patent And Licensing Inc.|Interactive locked state mobile communication device|
US8443303B2|2008-12-22|2013-05-14|Verizon Patent And Licensing Inc.|Gesture-based navigation|
US8799806B2|2008-12-31|2014-08-05|Verizon Patent And Licensing Inc.|Tabbed content view on a touch-screen device|
US8291348B2|2008-12-31|2012-10-16|Hewlett-Packard Development Company, L.P.|Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis|
US20100175029A1|2009-01-06|2010-07-08|General Electric Company|Context switching zooming user interface|
US8499251B2|2009-01-07|2013-07-30|Microsoft Corporation|Virtual page turn|
US8433998B2|2009-01-16|2013-04-30|International Business Machines Corporation|Tool and method for annotating an event map, and collaborating using the annotated event map|
US8750906B2|2009-02-20|2014-06-10|T-Mobile Usa, Inc.|Dynamic elements on a map within a mobile device, such as elements that facilitate communication between users|
US8819570B2|2009-03-27|2014-08-26|Zumobi, Inc|Systems, methods, and computer program products displaying interactive elements on a canvas|
US8355698B2|2009-03-30|2013-01-15|Microsoft Corporation|Unlock screen|
US8175653B2|2009-03-30|2012-05-08|Microsoft Corporation|Chromeless user interface|
US20100248741A1|2009-03-30|2010-09-30|Nokia Corporation|Method and apparatus for illustrative representation of a text communication|
US8238876B2|2009-03-30|2012-08-07|Microsoft Corporation|Notifications|
KR20100114572A|2009-04-16|2010-10-26|삼성전자주식회사|Method for displaying contents of terminal having touch screen and apparatus thereof|
EP2304543A1|2009-04-29|2011-04-06|Torch Mobile Inc.|Software-based asynchronous tiled backingstore|
US20100281409A1|2009-04-30|2010-11-04|Nokia Corporation|Apparatus and method for handling notifications within a communications device|
US8669945B2|2009-05-07|2014-03-11|Microsoft Corporation|Changing of list views on mobile device|
US8368707B2|2009-05-18|2013-02-05|Apple Inc.|Memory management based on automatic full-screen detection|
KR101620874B1|2009-05-19|2016-05-13|삼성전자주식회사|Searching Method of a List And Portable Device using the same|
US20110004845A1|2009-05-19|2011-01-06|Intelliborn Corporation|Method and System For Notifying A User of An Event Or Information Using Motion And Transparency On A Small Screen Display|
US8269736B2|2009-05-22|2012-09-18|Microsoft Corporation|Drop target gestures|
US8836648B2|2009-05-27|2014-09-16|Microsoft Corporation|Touch pull-in gesture|
US9298336B2|2009-05-28|2016-03-29|Apple Inc.|Rotation smoothing of a user interface|
US20100302176A1|2009-05-29|2010-12-02|Nokia Corporation|Zoom-in functionality|
US8225193B1|2009-06-01|2012-07-17|Symantec Corporation|Methods and systems for providing workspace navigation with a tag cloud|
US8621387B2|2009-06-08|2013-12-31|Apple Inc.|User interface for multiple display regions|
KR101561703B1|2009-06-08|2015-10-30|엘지전자 주식회사|The method for executing menu and mobile terminal using the same|
KR101649098B1|2009-06-30|2016-08-19|삼성전자주식회사|Apparatus and method for rendering using sensor in portable terminal|
US8239781B2|2009-06-30|2012-08-07|Sap Ag|Drag and drop of an application component to desktop|
US20110004839A1|2009-07-02|2011-01-06|Derek Cha|User-customized computer display method|
JP2011028524A|2009-07-24|2011-02-10|Toshiba Corp|Information processing apparatus, program and pointing method|
US20110029904A1|2009-07-30|2011-02-03|Adam Miles Smith|Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function|
US8656314B2|2009-07-30|2014-02-18|Lenovo Pte. Ltd.|Finger touch gesture for joining and unjoining discrete touch objects|
US8521809B2|2009-07-31|2013-08-27|Z2Live, Inc.|Mobile device notification controls system and method|
KR101484826B1|2009-08-25|2015-01-20|구글 잉크.|Direct manipulation gestures|
US8624933B2|2009-09-25|2014-01-07|Apple Inc.|Device, method, and graphical user interface for scrolling a multi-section document|
US8766928B2|2009-09-25|2014-07-01|Apple Inc.|Device, method, and graphical user interface for manipulating user interface objects|
TW201112074A|2009-09-30|2011-04-01|Higgstec Inc|Touch gesture detecting method of a touch panel|
CA2681879A1|2009-10-07|2011-04-07|Research In Motion Limited|A method of controlling touch input on a touch-sensitive display when a display element is active and a portable electronic device configured for the same|
US20110087988A1|2009-10-12|2011-04-14|Johnson Controls Technology Company|Graphical control elements for building management systems|
US8499253B2|2009-10-13|2013-07-30|Google Inc.|Individualized tab audio controls|
KR101701492B1|2009-10-16|2017-02-14|삼성전자주식회사|Terminal and method for displaying data thereof|
US9104275B2|2009-10-20|2015-08-11|Lg Electronics Inc.|Mobile terminal to display an object on a perceived 3D space|
US8261212B2|2009-10-20|2012-09-04|Microsoft Corporation|Displaying GUI elements on natural user interfaces|
US8677284B2|2009-11-04|2014-03-18|Alpine Electronics, Inc.|Method and apparatus for controlling and displaying contents in a user interface|
US20110113363A1|2009-11-10|2011-05-12|James Anthony Hunt|Multi-Mode User Interface|
US8839128B2|2009-11-25|2014-09-16|Cooliris, Inc.|Gallery application for content viewing|
KR101725887B1|2009-12-21|2017-04-11|삼성전자주식회사|Method and apparatus for searching contents in touch screen device|
US20110157027A1|2009-12-30|2011-06-30|Nokia Corporation|Method and Apparatus for Performing an Operation on a User Interface Object|
US9189500B2|2009-12-31|2015-11-17|Verizon Patent And Licensing Inc.|Graphical flash view of documents for data navigation on a touch-screen device|
US8786559B2|2010-01-06|2014-07-22|Apple Inc.|Device, method, and graphical user interface for manipulating tables using multi-contact gestures|
WO2011088131A1|2010-01-12|2011-07-21|Crane Merchandising Systems, Inc.|Mechanism for a vending machine graphical user interface utilizing xml for a versatile customer experience|
US9542097B2|2010-01-13|2017-01-10|Lenovo Pte. Ltd.|Virtual touchpad for a touch device|
EP2354914A1|2010-01-19|2011-08-10|LG Electronics Inc.|Mobile terminal and control method thereof|
US8930841B2|2010-02-15|2015-01-06|Motorola Mobility Llc|Methods and apparatus for a user interface configured to display event information|
US20110231796A1|2010-02-16|2011-09-22|Jose Manuel Vigil|Methods for navigating a touch screen device in conjunction with gestures|
US8473870B2|2010-02-25|2013-06-25|Microsoft Corporation|Multi-screen hold and drag gesture|
US20110209089A1|2010-02-25|2011-08-25|Hinckley Kenneth P|Multi-screen object-hold and page-change gesture|
US9075522B2|2010-02-25|2015-07-07|Microsoft Technology Licensing, Llc|Multi-screen bookmark hold gesture|
US8539384B2|2010-02-25|2013-09-17|Microsoft Corporation|Multi-screen pinch and expand gestures|
US8751970B2|2010-02-25|2014-06-10|Microsoft Corporation|Multi-screen synchronous slide gesture|
US9454304B2|2010-02-25|2016-09-27|Microsoft Technology Licensing, Llc|Multi-screen dual tap gesture|
US20110209101A1|2010-02-25|2011-08-25|Hinckley Kenneth P|Multi-screen pinch-to-pocket gesture|
US8589815B2|2010-03-10|2013-11-19|Microsoft Corporation|Control of timing for animations in dynamic icons|
US9170708B2|2010-04-07|2015-10-27|Apple Inc.|Device, method, and graphical user interface for managing folders|
US9052925B2|2010-04-07|2015-06-09|Apple Inc.|Device, method, and graphical user interface for managing concurrently open software applications|
FR2959037A1|2010-04-14|2011-10-21|Orange Vallee|Method for creating a media sequence by coherent groups of media files|
US8957920B2|2010-06-25|2015-02-17|Microsoft Corporation|Alternative semantics for zoom operations in a zoomable scene|
US8639747B2|2010-07-01|2014-01-28|Red Hat, Inc.|System and method for providing a cloud computing graphical user interface|
US8285258B2|2010-07-07|2012-10-09|Research In Motion Limited|Pushed content notification and display|
US20120050332A1|2010-08-25|2012-03-01|Nokia Corporation|Methods and apparatuses for facilitating content navigation|
US10140301B2|2010-09-01|2018-11-27|Apple Inc.|Device, method, and graphical user interface for selecting and using sets of media player controls|
EP2625660A4|2010-10-05|2014-06-11|Centric Software Inc|Interactive collection book for mobile devices|
US20120102433A1|2010-10-20|2012-04-26|Steven Jon Falkenburg|Browser Icon Management|
US20120151397A1|2010-12-08|2012-06-14|Tavendo Gmbh|Access to an electronic object collection via a plurality of views|
US9239674B2|2010-12-17|2016-01-19|Nokia Technologies Oy|Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event|
US20120159395A1|2010-12-20|2012-06-21|Microsoft Corporation|Application-launching interface for multiple modes|
US8689123B2|2010-12-23|2014-04-01|Microsoft Corporation|Application reporting in an application-selectable user interface|
US8612874B2|2010-12-23|2013-12-17|Microsoft Corporation|Presenting an application change through a tile|
US20120174029A1|2010-12-30|2012-07-05|International Business Machines Corporation|Dynamically magnifying logical segments of a view|
US9423951B2|2010-12-31|2016-08-23|Microsoft Technology Licensing, Llc|Content-based snap point|
US9311061B2|2011-02-10|2016-04-12|International Business Machines Corporation|Designing task execution order based on location of the task icons within a graphical user interface|
US9104288B2|2011-03-08|2015-08-11|Nokia Technologies Oy|Method and apparatus for providing quick access to media functions from a locked screen|
US9383917B2|2011-03-28|2016-07-05|Microsoft Technology Licensing, Llc|Predictive tiling|
US8893033B2|2011-05-27|2014-11-18|Microsoft Corporation|Application notifications|
US20120304118A1|2011-05-27|2012-11-29|Donahue Tyler J|Application Notification Display|
US9158445B2|2011-05-27|2015-10-13|Microsoft Technology Licensing, Llc|Managing an immersive interface in a multi-application immersive environment|
US20120299968A1|2011-05-27|2012-11-29|Tsz Yan Wong|Managing an immersive interface in a multi-application immersive environment|
US9104440B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US20120304068A1|2011-05-27|2012-11-29|Nazia Zaman|Presentation format for an application tile|
US20120304117A1|2011-05-27|2012-11-29|Donahue Tyler J|Application Notification Tags|
US20120304113A1|2011-05-27|2012-11-29|Patten Michael J|Gesture-based content-object zooming|
US9104307B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US9728164B2|2011-05-31|2017-08-08|Lenovo Pte. Ltd.|Moving a tile across multiple workspaces|
US8694603B2|2011-06-20|2014-04-08|International Business Machines Corporation|Geospatial visualization performance improvement for contiguous polylines with similar dynamic characteristics|
US8687023B2|2011-08-02|2014-04-01|Microsoft Corporation|Cross-slide gesture to select and rearrange|
US8700999B2|2011-08-15|2014-04-15|Google Inc.|Carousel user interface for document management|
US20130057587A1|2011-09-01|2013-03-07|Microsoft Corporation|Arranging tiles|
US8922575B2|2011-09-09|2014-12-30|Microsoft Corporation|Tile cache|
US10353566B2|2011-09-09|2019-07-16|Microsoft Technology Licensing, Llc|Semantic zoom animations|
US20130067398A1|2011-09-09|2013-03-14|Theresa B. Pittappilly|Semantic Zoom|
US20130067390A1|2011-09-09|2013-03-14|Paul J. Kwiatkowski|Programming Interface for Semantic Zoom|
US20130067420A1|2011-09-09|2013-03-14|Theresa B. Pittappilly|Semantic Zoom Gestures|
US20130067412A1|2011-09-09|2013-03-14|Microsoft Corporation|Grouping selectable tiles|
US9557909B2|2011-09-09|2017-01-31|Microsoft Technology Licensing, Llc|Semantic zoom linguistic helpers|
US8933952B2|2011-09-10|2015-01-13|Microsoft Corporation|Pre-rendering new content for an application-selectable user interface|
US9146670B2|2011-09-10|2015-09-29|Microsoft Technology Licensing, Llc|Progressively indicating new content in an application-selectable user interface|
US9244802B2|2011-09-10|2016-01-26|Microsoft Technology Licensing, Llc|Resource user interface|
US8243102B1|2011-10-12|2012-08-14|Google Inc.|Derivative-based selection of zones for banded map display|
Cited by:
US7093201B2|2001-09-06|2006-08-15|Danger, Inc.|Loop menu navigation apparatus and method|
US8225231B2|2005-08-30|2012-07-17|Microsoft Corporation|Aggregation of PC settings|
US8411046B2|2008-10-23|2013-04-02|Microsoft Corporation|Column organization of content|
US8086275B2|2008-10-23|2011-12-27|Microsoft Corporation|Alternative inputs of a mobile communications device|
US8238876B2|2009-03-30|2012-08-07|Microsoft Corporation|Notifications|
US8175653B2|2009-03-30|2012-05-08|Microsoft Corporation|Chromeless user interface|
US10397639B1|2010-01-29|2019-08-27|Sitting Man, Llc|Hot key systems and methods|
US9715332B1|2010-08-26|2017-07-25|Cypress Lake Software, Inc.|Methods, systems, and computer program products for navigating between visual components|
US8780130B2|2010-11-30|2014-07-15|Sitting Man, Llc|Methods, systems, and computer program products for binding attributes between visual components|
US20120159395A1|2010-12-20|2012-06-21|Microsoft Corporation|Application-launching interface for multiple modes|
US20120159383A1|2010-12-20|2012-06-21|Microsoft Corporation|Customization of an immersive environment|
US8689123B2|2010-12-23|2014-04-01|Microsoft Corporation|Application reporting in an application-selectable user interface|
US9436685B2|2010-12-23|2016-09-06|Microsoft Technology Licensing, Llc|Techniques for electronic aggregation of information|
US8612874B2|2010-12-23|2013-12-17|Microsoft Corporation|Presenting an application change through a tile|
US9679404B2|2010-12-23|2017-06-13|Microsoft Technology Licensing, Llc|Techniques for dynamic layout of presentation tiles on a grid|
US9423951B2|2010-12-31|2016-08-23|Microsoft Technology Licensing, Llc|Content-based snap point|
US9465440B2|2011-01-06|2016-10-11|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9471145B2|2011-01-06|2016-10-18|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9423878B2|2011-01-06|2016-08-23|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9383917B2|2011-03-28|2016-07-05|Microsoft Technology Licensing, Llc|Predictive tiling|
US9715485B2|2011-03-28|2017-07-25|Microsoft Technology Licensing, Llc|Techniques for electronic aggregation of information|
US20120272180A1|2011-04-20|2012-10-25|Nokia Corporation|Method and apparatus for providing content flipping based on a scrolling operation|
US9104307B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US9658766B2|2011-05-27|2017-05-23|Microsoft Technology Licensing, Llc|Edge gesture|
US9158445B2|2011-05-27|2015-10-13|Microsoft Technology Licensing, Llc|Managing an immersive interface in a multi-application immersive environment|
US9104440B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US8893033B2|2011-05-27|2014-11-18|Microsoft Corporation|Application notifications|
US8687023B2|2011-08-02|2014-04-01|Microsoft Corporation|Cross-slide gesture to select and rearrange|
US20130057587A1|2011-09-01|2013-03-07|Microsoft Corporation|Arranging tiles|
US8922575B2|2011-09-09|2014-12-30|Microsoft Corporation|Tile cache|
US9557909B2|2011-09-09|2017-01-31|Microsoft Technology Licensing, Llc|Semantic zoom linguistic helpers|
US10353566B2|2011-09-09|2019-07-16|Microsoft Technology Licensing, Llc|Semantic zoom animations|
US8933952B2|2011-09-10|2015-01-13|Microsoft Corporation|Pre-rendering new content for an application-selectable user interface|
US9244802B2|2011-09-10|2016-01-26|Microsoft Technology Licensing, Llc|Resource user interface|
US9146670B2|2011-09-10|2015-09-29|Microsoft Technology Licensing, Llc|Progressively indicating new content in an application-selectable user interface|
US8922584B2|2011-09-30|2014-12-30|Frederic Sigal|Method of creating, displaying, and interfacing an infinite navigable media wall|
WO2013089693A1|2011-12-14|2013-06-20|Intel Corporation|Gaze activated content transfer system|
KR101882724B1|2011-12-21|2018-08-27|삼성전자 주식회사|Category Search Method And Portable Device supporting the same|
US9223472B2|2011-12-22|2015-12-29|Microsoft Technology Licensing, Llc|Closing applications|
JP2013152566A|2012-01-24|2013-08-08|Funai Electric Co Ltd|Remote control device|
US9128605B2|2012-02-16|2015-09-08|Microsoft Technology Licensing, Llc|Thumbnail-image selection of applications|
KR20130097266A|2012-02-24|2013-09-03|삼성전자주식회사|Method and apparatus for editing contents view in mobile terminal|
JP6055734B2|2012-09-26|2016-12-27|京セラドキュメントソリューションズ株式会社|Display input device and image forming apparatus having the same|
US9891781B2|2012-10-05|2018-02-13|Htc Corporation|Mobile communications device, non-transitory computer-readable medium and method of navigating between a plurality of different views of home screen of mobile communications device|
US9335913B2|2012-11-12|2016-05-10|Microsoft Technology Licensing, Llc|Cross slide gesture|
US8814683B2|2013-01-22|2014-08-26|Wms Gaming Inc.|Gaming system and methods adapted to utilize recorded player gestures|
CN112215914A|2013-02-23|2021-01-12|高通股份有限公司|System and method for interactive image caricature generation by an electronic device|
US10120540B2|2013-03-14|2018-11-06|Samsung Electronics Co., Ltd.|Visual feedback for user interface navigation on television system|
US10025459B2|2013-03-14|2018-07-17|Airwatch Llc|Gesture-based workflow progression|
US9767076B2|2013-03-15|2017-09-19|Google Inc.|Document scale and position optimization|
US9588675B2|2013-03-15|2017-03-07|Google Inc.|Document scale and position optimization|
US20140298258A1|2013-03-28|2014-10-02|Microsoft Corporation|Switch List Interactions|
US20140298219A1|2013-03-29|2014-10-02|Microsoft Corporation|Visual Selection and Grouping|
CN103246449B|2013-04-16|2016-03-02|广东欧珀移动通信有限公司|The screen unlock method of mobile terminal and mobile terminal|
DE102013009009A1|2013-05-17|2014-12-04|Elektrobit Automotive GmbH|System and method for data selection by means of a touch-sensitive surface|
US9450952B2|2013-05-29|2016-09-20|Microsoft Technology Licensing, Llc|Live tiles without application-code execution|
US20140372923A1|2013-06-14|2014-12-18|Microsoft Corporation|High Performance Touch Drag and Drop|
JP6218451B2|2013-06-18|2017-10-25|シャープ株式会社|Program execution device|
USD732561S1|2013-06-25|2015-06-23|Microsoft Corporation|Display screen with graphical user interface|
JP2015022567A|2013-07-19|2015-02-02|富士ゼロックス株式会社|Information processing apparatus and information processing program|
JP5505550B1|2013-08-06|2014-05-28|富士ゼロックス株式会社|Image display apparatus and program|
CN105474150B|2013-09-02|2018-09-21|索尼公司|Information processing unit, information processing method and program|
US9176657B2|2013-09-14|2015-11-03|Changwat TUMWATTANA|Gesture-based selection and manipulation method|
US20150286391A1|2014-04-08|2015-10-08|Olio Devices, Inc.|System and method for smart watch navigation|
EP3058448A4|2013-10-18|2017-04-12|Citrix Systems Inc.|Providing enhanced message management user interfaces|
US20150128095A1|2013-11-07|2015-05-07|Tencent Technology Company Limited|Method, device and computer system for performing operations on objects in an object list|
USD767590S1|2013-12-30|2016-09-27|Nikolai Joukov|Display screen or portion thereof with graphical user interface for displaying software cells|
JP5924554B2|2014-01-06|2016-05-25|コニカミノルタ株式会社|Object stop position control method, operation display device, and program|
WO2015149347A1|2014-04-04|2015-10-08|Microsoft Technology Licensing, Llc|Expandable application representation|
KR102107275B1|2014-04-10|2020-05-06|마이크로소프트 테크놀로지 라이센싱, 엘엘씨|Collapsible shell cover for computing device|
EP3129847A4|2014-04-10|2017-04-19|Microsoft Technology Licensing, LLC|Slider cover for computing device|
CN106170747A|2014-04-14|2016-11-30|夏普株式会社|Input equipment and the control method of input equipment|
US10089346B2|2014-04-25|2018-10-02|Dropbox, Inc.|Techniques for collapsing views of content items in a graphical user interface|
US9891794B2|2014-04-25|2018-02-13|Dropbox, Inc.|Browsing and selecting content items based on user gestures|
US9547433B1|2014-05-07|2017-01-17|Google Inc.|Systems and methods for changing control functions during an input gesture|
US10656784B2|2014-06-16|2020-05-19|Samsung Electronics Co., Ltd.|Method of arranging icon and electronic device supporting the same|
KR101631966B1|2014-06-19|2016-06-20|엘지전자 주식회사|Mobile terminal and method for controlling the same|
US10592080B2|2014-07-31|2020-03-17|Microsoft Technology Licensing, Llc|Assisted presentation of application windows|
US10254942B2|2014-07-31|2019-04-09|Microsoft Technology Licensing, Llc|Adaptive sizing and positioning of application windows|
US10678412B2|2014-07-31|2020-06-09|Microsoft Technology Licensing, Llc|Dynamic joint dividers for application windows|
US20160070460A1|2014-09-04|2016-03-10|Adobe Systems Incorporated|In situ assignment of image asset attributes|
US10642365B2|2014-09-09|2020-05-05|Microsoft Technology Licensing, Llc|Parametric inertia and APIs|
CN104238944B|2014-09-10|2015-08-26|腾讯科技(深圳)有限公司|A kind of document handling method, device and terminal device|
CN106662891B|2014-10-30|2019-10-11|微软技术许可有限责任公司|Multi-configuration input equipment|
US20160147381A1|2014-11-26|2016-05-26|Blackberry Limited|Electronic device and method of controlling display of information|
US10120848B2|2014-12-09|2018-11-06|Salesforce.Com, Inc.|Methods and systems for applying responsive design to subframes on a web page|
US9883007B2|2015-01-20|2018-01-30|Microsoft Technology Licensing, Llc|Downloading an application to an apparatus|
CN104571871A|2015-01-26|2015-04-29|深圳市中兴移动通信有限公司|Method and system for selecting files|
JP6532372B2|2015-10-06|2019-06-19|キヤノン株式会社|Display control device, control method thereof and program|
US10409465B2|2015-12-08|2019-09-10|International Business Machines Corporation|Selecting areas of content on a touch screen|
JP6624972B2|2016-02-26|2019-12-25|キヤノン株式会社|Method, apparatus, and program for controlling display|
US10386933B2|2016-08-30|2019-08-20|International Business Machines Corporation|Controlling navigation of a visual aid during a presentation|
US10802125B2|2016-10-03|2020-10-13|FLIR Belgium BVBA|Touch-gesture control for side-looking sonar systems|
US11209912B2|2016-12-06|2021-12-28|Rohde & Schwarz GmbH & Co. KG|Measuring device and configuration method|
US10579740B2|2016-12-28|2020-03-03|Motorola Solutions, Inc.|System and method for content presentation selection|
KR102316024B1|2017-03-02|2021-10-26|삼성전자주식회사|Display apparatus and user interface displaying method thereof|
US10345957B2|2017-06-21|2019-07-09|Microsoft Technology Licensing, Llc|Proximity selector|
CN107358213B|2017-07-20|2020-02-21|湖南科乐坊教育科技股份有限公司|Method and device for detecting reading habits of children|
DK180470B1|2017-08-31|2021-05-06|Apple Inc|Systems, procedures, and graphical user interfaces for interacting with augmented and virtual reality environments|
DK201870348A1|2018-01-24|2019-10-08|Apple Inc.|Devices, Methods, and Graphical User Interfaces for System-Wide Behavior for 3D Models|
JP2019139679A|2018-02-15|2019-08-22|コニカミノルタ株式会社|Image processing apparatus, screen handling method, and computer program|
US10936281B2|2018-12-19|2021-03-02|International Business Machines Corporation|Automatic slide page progression based on verbal and visual cues|
US11237716B2|2019-10-14|2022-02-01|Sling TV L.L.C.|Devices, systems and processes for facilitating user adaptive progressions through content|
CN110971976B|2019-11-22|2021-08-27|中国联合网络通信集团有限公司|Audio and video file analysis method and device|
US11099729B1|2020-05-29|2021-08-24|Capital One Services, Llc|Methods and systems for displaying content based on a scroll pattern|
Legal status:
2018-10-16| B25A| Requested transfer of rights approved|Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC (US) |
2018-12-11| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-11-05| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-03-23| B07A| Technical examination (opinion): publication of technical examination (opinion) [chapter 7.1 patent gazette]|
2021-05-25| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-07-27| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 17/07/2012, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US13/196,272|US8687023B2|2011-08-02|2011-08-02|Cross-slide gesture to select and rearrange|
US13/196,272|2011-08-02|
PCT/US2012/047091|WO2013019404A1|2011-08-02|2012-07-17|Cross-slide gesture to select and rearrange|